Category: Work

  • When Development Meets Readiness

    When Development Meets Readiness

    One of the quiet tensions in organisations rarely appears in strategy documents or leadership frameworks. It shows up in everyday work.

    A junior colleague asks to take on more “special projects.” A manager encourages them to try. Meanwhile, the operational workload continues to grow.

    In that moment, a question naturally arises for those of us managing work on the ground:

    Are we assigning work based on readiness… or development?

    The answer, more often than not, is both, and that is where the complexity begins.

    The Development vs Readiness Dilemma

    Organisations must constantly balance two priorities:

    • Delivering outcomes today
    • Developing people for tomorrow

    If every important task is given only to the most experienced person, the organisation may achieve short-term efficiency but fail to build future capability. But if responsibilities are given too quickly to people who are not ready, delivery risks increase.

    Leaders often resolve this tension with a simple phrase:

    “Let them try.”

    On the surface, this decision can feel uncomfortable to those responsible for ensuring work gets done properly. Operational gaps may appear obvious. Planning may seem incomplete. Communication may require refinement. Yet development rarely happens in perfectly controlled environments. It happens in real situations, with real responsibility, and sometimes growth begins exactly where readiness feels uncertain.

    The Middle Management Perspective

    For those of us in middle management, this tension becomes especially visible.

    We sit between leadership’s intention and operational reality. We understand why leaders want to create opportunities for emerging talent. At the same time, we see the practical implications on workload, timelines, and team coordination.

    When a junior team member requests to focus on special projects while reducing their business-as-usual responsibilities, it can raise a difficult question:

    Who carries the operational load while development happens?

    This is not simply a matter of fairness. It is a matter of capacity management.

    In many teams, operational work is where discipline, consistency, and accountability are built. It is the foundation that enables people to later handle complex projects. Without that foundation, projects risk becoming ideas without execution.

    Leadership Styles and Expectations

    Another dynamic often appears in these situations: differences in leadership style. Some leaders operate with a highly directive approach. They think through problems, prepare materials, and provide clear instructions for the team to execute. This approach can be extremely effective for junior teams because it provides certainty and structure.

    Other leaders operate differently. Instead of providing answers, they expect team members to propose solutions, draft plans, and think through problems independently.

    The intention is to build ownership and professional maturity.

    However, when a team is used to directive leadership, this shift can feel uncomfortable.

    Questions such as:

    • “What should we do?”
    • “Can you tell us the steps?”
    • “Can you prepare the structure first?”

    begin to surface.

    In these moments, the gap is not simply about competence. It is about expectations of leadership.

    Ownership vs Direction

    One of the most important transitions in professional growth is moving from waiting for instructions to owning outcomes. Yet this transition rarely happens automatically.

    Junior professionals often equate supportive leadership with receiving clear answers and step-by-step guidance. When asked to think independently, they may feel uncertain or even unsupported.

    This is where leadership must strike a careful balance.

    • Providing no structure can feel like abandonment.
    • Providing all the answers prevents growth.

    A more effective approach is what some leaders call structured ownership: instead of solving the problem for the team, the leader provides a framework for thinking.

    For example: “Prepare a proposal with the objective, timeline, and key steps. Then we review together.”

    This approach maintains support while keeping responsibility where it belongs.

    The Quiet Role of Professional Restraint

    In environments where development and readiness are being balanced, experienced professionals play an important stabilising role. Not every gap needs immediate correction. Not every disagreement requires escalation.

    Sometimes the most valuable contribution is professional composure.

    • Stepping back when appropriate.
    • Providing structure without taking over.
    • Allowing others to experience responsibility.

    This does not mean lowering standards. It means recognising that growth often involves discomfort, and that organisations develop people not only through perfect execution, but through experience.

    A Reflection from the Middle

    Working in the middle of an organisation often means navigating tensions that are not immediately visible.

    We see the operational realities. We understand leadership’s intentions, and we often feel responsible for ensuring both can coexist.

    In these moments, the question is not simply who is right. The question is how organisations can continue delivering results while still creating space for people to grow. Because in the long run, strong organisations are not built only by those who are already ready.

    They are built by leaders who know when to say, “Let them try,” and by professionals who understand how to support that process responsibly.

  • Living With Imperfect Systems

    Living With Imperfect Systems

    Digital transformation is often presented as a story of efficiency. New systems promise automation, seamless integration, and the elimination of manual work. In theory, once technology is in place, processes should become smoother and data should flow cleanly across platforms.

    In practice, the reality is far more complicated.

    Many organisations operate with systems that were designed at different times, for different purposes, and by different teams. Over the years, new platforms are layered onto older ones. A learning system may depend on data from a student management system. Reporting tools may depend on both. Each system works well within its own boundaries, but the moment data needs to move across systems, inconsistencies begin to appear.

    Recently, I encountered one such situation while validating records between two institutional platforms. What initially looked like a small discrepancy turned out to be a deeper issue involving identity records across systems.

    The discussion that followed led to a familiar conclusion.

    The system would remain as it is.

    And the discrepancies would be managed manually.

    At first glance, this outcome raises an uncomfortable question: is this really an efficient way to work?

    The Expectation of Perfect Systems

    Professionals working close to systems often approach problems with a particular mindset. When a discrepancy appears, the instinct is to investigate the root cause, understand the structural issue, and fix the system so the problem does not recur.

    This instinct comes from a governance perspective. Systems should be reliable. Data should be consistent. Processes should not depend on constant manual correction.

    In an ideal environment, the solution to system discrepancies would be straightforward: adjust the structure, align the data rules, and ensure that the problem cannot happen again.

    But organisations rarely operate in ideal conditions.

    Most institutional systems evolve gradually. A student management system may have been implemented years ago. Other platforms are introduced later to support new functions. Each system carries its own design assumptions. Over time, the connections between them become more complex.

    When discrepancies surface, fixing the issue at its source may require much more than a small technical adjustment.

    It may require redesigning the entire structure of how the systems interact.

    Why System Changes Are Not Always Immediate

    From a purely technical standpoint, correcting system discrepancies often makes sense. However, system changes rarely exist in isolation.

    Institutional systems sit at the intersection of multiple functions. A change in one area can affect academic records, financial records, compliance reporting, or regulatory requirements. What appears to be a simple data rule may have implications far beyond the original system.

    Because of this, organisations sometimes choose a different path.

    Instead of redesigning the system immediately, they decide to stabilise operations and manage exceptions manually until a larger system change becomes feasible.

    This approach may not be elegant, but it is often pragmatic.

    System redesign requires time, resources, and coordination across multiple departments. If a major platform upgrade is already planned in the future, organisations may prefer to maintain the current structure temporarily rather than introduce changes that will soon be replaced.

    In such situations, operational teams are asked to work within the limitations of the existing system.

    The Discomfort of Manual Workarounds

    For people who care deeply about systems and processes, this decision can feel frustrating.

    Manual workarounds introduce inefficiency. They require additional checks, additional communication, and additional documentation. Instead of eliminating errors, the organisation now depends on people to catch and correct them.

    From a process improvement perspective, this is far from ideal.

    Manual processes increase operational risk. They rely on human vigilance, which is never perfect. They also consume time that could otherwise be spent on more strategic work.

    It is therefore natural to ask whether these workarounds are simply excuses for poor system design or operational complacency.

    In some cases, that concern may be justified. If organisations ignore system problems entirely and allow manual corrections to become the default solution indefinitely, inefficiency becomes embedded into everyday operations.

    But not every workaround reflects incompetence.

    Sometimes it reflects constraint.

    Constraint Management Versus Incompetence

    There is an important difference between incompetence and constraint management.

    Incompetence occurs when organisations ignore problems, fail to document processes, and repeatedly encounter the same issues without learning from them. Constraint management, on the other hand, acknowledges the problem but recognises that the system cannot be changed immediately. Instead, the organisation introduces structured processes to manage the limitation while preparing for a future solution.

    The difference lies in discipline.

    When manual workarounds are handled carefully—with documentation, clear procedures, and accountability—they become a temporary operational bridge rather than a permanent weakness.

    This distinction is important because it shapes how teams respond to system limitations.

    If the workaround is chaotic, frustration grows quickly. If it is structured, teams can continue operating while the organisation prepares for larger infrastructure changes.

    Governance Within Imperfect Systems

    Working with imperfect systems does not mean abandoning governance. In fact, governance becomes even more important when systems cannot enforce consistency automatically.

    Where systems fall short, processes must compensate.

    This means establishing clear internal guidelines for handling discrepancies. Teams need to understand how identity conflicts should be resolved, how records should be verified, and how manual corrections should be documented.

    These steps may appear administrative, but they serve an essential purpose. They preserve transparency.

    If questions arise later about how a record was handled or why a discrepancy occurred, the organisation can trace the decision-making process.

    Without such documentation, manual corrections quickly become invisible, and institutional memory fades.

    Strong governance ensures that even temporary solutions remain accountable.
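
    One minimal sketch of what such accountability could look like is an append-only correction log, in which every manual fix records what was changed, by whom, when, why, and with whose approval. The field names and file format below are illustrative assumptions rather than a prescribed standard.

    import csv
    import datetime
    import os

    # Hypothetical append-only log of manual corrections made outside the system.
    LOG_FILE = "manual_corrections.csv"
    FIELDS = ["timestamp", "record_id", "field", "old_value", "new_value",
              "reason", "corrected_by", "approved_by"]

    def log_correction(entry: dict) -> None:
        """Append one documented correction; earlier entries are never overwritten."""
        new_file = not os.path.exists(LOG_FILE)
        with open(LOG_FILE, "a", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=FIELDS)
            if new_file:  # write the header only when the log is first created
                writer.writeheader()
            writer.writerow({"timestamp": datetime.datetime.now().isoformat(), **entry})

    log_correction({
        "record_id": "S1002",
        "field": "status",
        "old_value": "active",
        "new_value": "graduating",
        "reason": "Platform and registry disagreed; confirmed with Registry",
        "corrected_by": "ops.team@example.edu",
        "approved_by": "registrar@example.edu",
    })

    Even a simple log like this keeps temporary workarounds traceable until the underlying system can be changed.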

    Leadership in Imperfect Systems

    Situations like this also reveal an important leadership challenge.

    Professionals who work closely with systems often see structural issues before others do. Their responsibility is to raise these concerns and highlight potential risks.

    However, leadership decisions are rarely based on technical logic alone.

    Leaders must also weigh organisational priorities: stability, resource allocation, cross-department relationships, and long-term system plans. Sometimes the decision is not to fix the system immediately, but to contain the issue until a larger transformation is possible.

    Accepting that decision requires a shift in perspective.

    The role of operational teams then becomes ensuring that the temporary solution remains controlled and sustainable.

    This does not mean ignoring the original problem. It means managing it responsibly until the organisation is ready for structural change.

    Learning From Imperfect Systems

    Ironically, imperfect systems often teach organisations valuable lessons.

    Discrepancies reveal hidden assumptions in system design. They expose gaps between departments. They highlight where governance needs strengthening.

    When these lessons are documented, they become useful input for future system improvements.

    If a new platform is eventually introduced, the organisation will already have a clearer understanding of where the previous system struggled.

    In this way, today’s operational challenges become tomorrow’s institutional knowledge.

    A Different Kind of Efficiency

    Returning to the original question—whether manual workarounds are efficient—the answer remains complex.

    From a purely operational perspective, they are not.

    Manual intervention consumes time and introduces risk. Automated systems will always be more efficient when they function correctly.

    However, efficiency must also be considered within organisational context.

    If redesigning a system today would create greater disruption than maintaining it temporarily, leaders may decide that stability is the more responsible choice.

    In such cases, the goal shifts from perfect efficiency to controlled continuity.

    The challenge then is not eliminating the workaround entirely, but ensuring it is managed with clarity and discipline.

    Pragmatism and Responsibility

    Digital transformation narratives often celebrate innovation and automation. Yet much of the real work inside organisations involves navigating imperfect systems with professionalism and care.

    Operating within constraints does not mean lowering standards. It means recognising the difference between what is technically ideal and what is organisationally feasible at a given moment.

    Responsible governance lies in bridging that gap.

    Systems will evolve. Platforms will eventually be replaced. New technologies will promise cleaner integration and better data structures.

    Until then, organisations must continue functioning.

    And sometimes the most responsible form of leadership is not insisting on immediate perfection, but managing imperfect systems with transparency, discipline, and pragmatic judgment.

  • When Learning Becomes Self-Paced, How Should Universities Measure Learning?

    When Learning Becomes Self-Paced, How Should Universities Measure Learning?

    Over the past two decades, the landscape of higher education has gradually shifted. Universities that once relied almost entirely on face-to-face lectures now operate in environments shaped by digital platforms, learning management systems, and increasingly flexible learning pathways. Students today often encounter course materials before class, revisit them after teaching sessions, and sometimes complete entire segments of learning independently.

    In this environment, learning is no longer confined to the lecture hall. It unfolds across multiple spaces: course materials, discussion forums, recorded lectures, and independent study. The experience of learning has become more distributed, and in many cases more self-paced.

    This evolution raises an important question for universities: if learning increasingly occurs through self-directed engagement with materials and digital environments, how should institutions measure whether learning is truly taking place?

    Traditionally, universities have relied on familiar indicators such as examination results, assignment grades, and course completion rates. These remain important measures of academic performance. However, the shift toward self-paced learning introduces new layers to the learning process—layers that may not be fully captured by traditional evaluation methods alone.

    Understanding how learning unfolds in this new environment requires a broader perspective on how teaching, learning materials, and student engagement interact within the academic system.

    From Teaching Events to Learning Environments

    In the traditional university model, teaching was largely organised around scheduled events. Lectures, tutorials, and seminars provided the primary spaces where learning occurred. Students attended classes, listened to explanations, asked questions, and engaged in discussions with their lecturers and peers.

    In such environments, the lecturer played a central role in guiding the learning process. Much of the instructional support happened during teaching sessions. Lecturers clarified difficult ideas, provided examples, and responded to students’ questions in real time.

    Because teaching was highly visible, learning effectiveness was often inferred through observable outcomes. Examination results and assessment grades served as indicators that students had achieved the expected learning outcomes.

    However, as learning environments expanded beyond the classroom, the dynamics of teaching and learning began to change.

    Today, students frequently engage with course materials through learning management systems. They read instructional documents, watch recorded lectures, participate in online discussions, and complete activities independently before or after formal teaching sessions.

    Learning has therefore become less tied to specific teaching events and more embedded within a broader learning environment.

    The Rise of Self-Paced Learning

    Self-paced learning does not mean that students learn without guidance. Rather, it means that the rhythm of learning is no longer entirely dictated by classroom schedules. Students may spend time reviewing course materials at different moments, revisiting complex ideas, or progressing through learning activities according to their own pace.

    Instructional materials play a much more significant role in this environment. Course documents, digital modules, and learning resources often become the primary guides through which students encounter new knowledge.

    In such settings, the learning process unfolds gradually through multiple forms of interaction: reading, reflection, discussion, and practice. The lecturer remains important, but the learning experience is no longer confined to direct instruction.

    This shift inevitably raises questions about how learning should be evaluated.

    If learning occurs across materials, digital interactions, and independent study, measuring learning effectiveness becomes more complex.

    The Indicators Universities Already Measure

    Most universities already collect significant amounts of data related to teaching and learning. In digital learning environments, learning management systems provide various indicators of student activity and engagement.

    For example, institutions may monitor:

    • how frequently students access course materials
    • how actively students participate in online discussions
    • how often lecturers interact with students within the platform
    • how quickly students progress through course modules
    • how students perform in graded assessments
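
    As a rough illustration of how raw platform activity might be rolled up into indicators of this kind, the short sketch below counts material views, discussion posts, and instructor replies from a hypothetical event log. The event names and log structure are assumptions for illustration; real learning management systems expose their own logs and reporting interfaces.

    from collections import Counter

    # Hypothetical LMS event log: (role, user_id, event_type)
    events = [
        ("student", "S1001", "view_material"),
        ("student", "S1001", "post_discussion"),
        ("student", "S1002", "view_material"),
        ("student", "S1002", "view_material"),
        ("lecturer", "T01", "reply_discussion"),
    ]

    material_views = Counter(uid for role, uid, ev in events
                             if role == "student" and ev == "view_material")
    discussion_posts = Counter(uid for role, uid, ev in events
                               if role == "student" and ev == "post_discussion")
    instructor_replies = sum(1 for role, _, ev in events
                             if role == "lecturer" and ev == "reply_discussion")

    print("Material views per student:", dict(material_views))
    print("Discussion posts per student:", dict(discussion_posts))
    print("Instructor replies in this course:", instructor_replies)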

    Each of these indicators provides a different perspective on the learning process.

    Engagement metrics can reveal whether students are interacting with course content. Instructor activity may indicate the level of teaching presence within the course. Assessment results demonstrate how well students perform when evaluated.

    From a system perspective, these indicators appear to offer a rich picture of teaching and learning.

    Yet in practice, these measures are often interpreted separately.

    The Fragmentation of Learning Indicators

    One challenge in evaluating learning effectiveness is that different indicators are frequently treated as independent measurements.

    Engagement analytics may be used to monitor student participation in the learning platform. Instructor interaction metrics may be used to evaluate teaching presence. Assessment results measure academic achievement. Completion rates indicate whether students finish their courses.

    Each of these indicators provides useful information, but they rarely form part of a unified interpretation of the learning process.

    For example, a course may show high levels of student login activity, but this does not necessarily indicate deep engagement with the material. A lecturer may post frequently in discussion forums, but the volume of interaction does not always reveal whether meaningful learning conversations are taking place.

    Similarly, strong assessment results may reflect effective learning—or they may simply indicate that students have adapted well to the structure of the assessment itself.

    When these indicators are analysed separately, institutions see fragments of the learning experience rather than the full learning journey.

    Teaching Effort and Instructor Presence

    One area where this fragmentation becomes visible is in the measurement of instructor engagement within digital learning environments.

    Many institutions monitor how actively lecturers participate in the course platform. Indicators such as the number of posts, announcements, or responses to students are sometimes used as signals of teaching effort.

    These metrics can be useful. In self-paced learning environments, instructor presence helps students feel supported and connected to the course. When lecturers respond to questions, initiate discussions, or provide feedback, they help sustain the learning environment.

    However, the quantity of instructor interaction does not always reflect the quality of teaching engagement. A lecturer may post frequently without necessarily stimulating deeper thinking among students. Conversely, a lecturer who interacts less often may design activities that generate meaningful peer discussion and reflection.

    Seen in isolation, instructor engagement metrics therefore provide only a partial picture of teaching effectiveness.

    To understand learning more fully, instructor activity must be considered alongside student interaction and learning outcomes.

    Learning as a Continuum

    Rather than viewing engagement, instructor interaction, and academic performance as separate indicators, it may be more helpful to see them as stages within a single learning continuum.

    Learning often unfolds through a sequence of interconnected experiences.

    Students first encounter course materials and become exposed to new ideas. They engage with the content by reading, watching, or listening. Interaction with lecturers or peers may help them clarify their understanding and explore different perspectives. Through practice and discussion, they begin to apply what they have learned. Finally, they demonstrate their understanding through assessments and projects.

    From this perspective, engagement metrics, instructor activity, and assessment outcomes are not separate phenomena. They represent different signals along the same learning journey.

    For example, a student’s interaction with course materials may lead to participation in discussions. Those discussions may deepen understanding, which then influences performance in assignments and examinations.

    When these signals are examined together rather than independently, they begin to reveal how learning actually unfolds within the course environment.

    A Systems Perspective on Learning Effectiveness

    Viewing learning through a systems perspective does not require universities to abandon existing evaluation methods. Assessment results, completion rates, and engagement analytics all remain valuable sources of information.

    What may need to evolve is how these indicators are interpreted.

    Instead of treating them as separate metrics, institutions might begin to examine how they relate to one another. Patterns between instructor engagement, student participation, and assessment performance may reveal deeper insights about the effectiveness of the learning environment.

    For example, courses where instructor interaction stimulates meaningful student discussion may also show stronger conceptual understanding in assessments. Similarly, patterns of student engagement with course materials may help explain variations in learning outcomes across different cohorts.
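
    As a small, hedged sketch of what examining these relationships might involve, the example below computes simple correlations between per-student engagement counts and final assessment scores. The numbers are invented, and in practice such analysis would sit on top of a properly governed learning analytics dataset rather than hand-typed lists.

    from statistics import correlation  # available in Python 3.10+

    # Hypothetical per-student signals from one cohort.
    material_views   = [12, 30, 7, 22, 18, 25]
    discussion_posts = [ 1,  6, 0,  4,  3,  5]
    final_scores     = [58, 81, 49, 74, 66, 79]

    # Pearson correlation between each engagement signal and the assessment outcome.
    print("views vs score:", round(correlation(material_views, final_scores), 2))
    print("posts vs score:", round(correlation(discussion_posts, final_scores), 2))

    A strong positive pattern here would not prove that engagement causes better results, but it would prompt exactly the kind of integrated questions described above.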

    Understanding these relationships requires moving beyond isolated indicators toward a more integrated view of how learning operates within the institutional system.

    The Next Stage of Learning Evaluation

    As universities continue to adopt flexible and digital learning models, the evaluation of learning may need to evolve alongside these changes.

    Traditional measures of academic performance will remain essential. However, they may increasingly be complemented by insights drawn from learning analytics and engagement patterns within digital platforms.

    The challenge for institutions is not merely to collect more data, but to interpret existing signals more meaningfully.

    When engagement, instructor presence, and academic performance are understood as connected parts of the learning journey, universities gain a clearer understanding of how teaching practices, instructional materials, and student behaviours interact within the learning environment.

    Conclusion: Measuring Learning in an Evolving System

    The shift toward self-paced and digitally supported learning environments represents an important evolution in higher education. As teaching expands beyond the lecture hall, the ways in which universities understand and evaluate learning must also adapt.

    Students today learn through a combination of instructional materials, digital interactions, and guided teaching. Their learning journeys unfold across multiple spaces rather than within a single classroom event.

    In such environments, measuring learning effectiveness requires more than examining isolated indicators of engagement or performance. It requires recognising that these indicators are interconnected signals within a broader learning system.

    Seen from this perspective, the evaluation of learning becomes not simply a matter of measuring outcomes, but of understanding how learning unfolds across the institutional environment.

    As universities continue to explore new models of teaching and learning, this systems perspective may offer a more nuanced way of understanding what it means for learning to truly take place.

  • Rethinking the Structure of Self-Instructional Materials

    Rethinking the Structure of Self-Instructional Materials

    In Malaysian higher education, Self-Instructional Materials (SIM) have become a familiar component of course design and programme documentation. Universities prepare these materials for each course, upload them to learning platforms, and present them during programme reviews or accreditation exercises. On the surface, the system appears well structured. Learning outcomes are stated, topics are organised, and supporting materials are compiled for students.

    The broader context for these practices is shaped by the Malaysian Qualifications Agency (MQA), the national body responsible for assuring the quality and standards of tertiary education in Malaysia. MQA frameworks emphasise student-centred learning, constructive alignment between outcomes and assessments, and the importance of learning resources that support independent study.

    Within this context, SIM is intended to function as a learning resource that helps students engage with course material beyond the classroom.

    Yet when one looks more closely at how SIM appears in practice, an interesting question begins to emerge.

    “Are these materials truly instructional in nature, or are they primarily informational?”

    This question is not meant as criticism. Rather, it reflects a growing awareness that the presence of content does not always mean the presence of instructional design. Many SIM documents are academically rich and comprehensive, yet the pathway through which students develop understanding is not always visible within the material itself.

    Understanding this distinction requires examining both the structure of SIM and the role of teaching within the university environment.

    The Current Shape of SIM

    In many institutions, SIM resembles an expanded set of lecture notes. The document typically begins with the course learning outcomes, followed by a sequence of topics organised week by week. Each topic contains explanations of concepts, theoretical discussions, diagrams, and recommended readings. Towards the end, students encounter self-practice exercises, checkpoints, assignments, or assessment tasks designed to evaluate their understanding.

    From an academic perspective, this structure makes sense. It demonstrates that the course content has been carefully developed and that the key concepts are presented in a logical order. It also ensures that important theories, frameworks, and concepts are properly documented.

    However, the internal logic of this format is primarily informational. The document answers the question: What information should students receive?

    The structure often follows a familiar pattern. A concept is introduced, the theory is explained, and the next concept follows. Learning is assumed to occur as students read through the material and attend lectures that accompany it.

    For many years, this approach functioned effectively because the classroom itself carried much of the instructional work.

    The Lecturer as the Instructional Guide

    It is important to recognise that SIM has traditionally existed alongside teaching rather than replacing it. In most universities, lecturers remain central to the learning process. They interpret the material, illustrate concepts through examples, and respond to questions that arise during class discussions.

    Much of the instructional guidance that students receive comes from the lecturer’s explanation rather than from the written material itself. Lecturers demonstrate how theories apply to real situations, guide students through difficult ideas, and clarify misunderstandings as they emerge.

    In this sense, the lecturer animates the material.

    For many years, SIM functioned primarily as a reference document that supported classroom teaching. The lecturer provided the instructional scaffolding while the material documented the content of the course.

    This arrangement worked well in traditional face-to-face environments. However, as learning environments evolve, the relationship between teaching and learning materials begins to shift.

    Changing Learning Environments

    Today, universities operate in increasingly complex learning environments. Courses may be delivered across multiple campuses, involve different teaching teams, or combine face-to-face sessions with online learning. In some cases, students engage with course materials independently before meeting their lecturers.

    In these contexts, SIM begins to play a larger role in the learning process. The document is no longer only a reference for lectures; it becomes part of the learning pathway itself. It is the guide that supports students as they study outside the classroom.

    Students rely on it not only to understand what is taught but also to structure their learning between teaching sessions. When this happens, the internal design of SIM becomes more important. The document must help students navigate the learning process rather than simply present information.

    This is where the difference between informational and instructional materials becomes more visible.

    Informational Materials: Knowledge as Coverage

    Informational materials are organised around subject matter completeness. Their primary objective is to ensure that students are exposed to the necessary theories, models, and frameworks within a discipline.

    The document reflects the intellectual structure of the subject itself. Concepts are explained in depth, theoretical debates are introduced, and readings are provided to extend understanding.

    This approach has clear strengths. It respects disciplinary knowledge and ensures that students encounter the intellectual foundations of their field. Academic depth is preserved.

    However, informational materials do not always make the learning process explicit. They present knowledge but may not demonstrate how students should move from understanding concepts to applying or evaluating them.

    Students are often expected to make these connections independently.

    When lecturers guide the process through discussion and explanation, the system works well. But when students rely heavily on the material itself, informational structure can leave important steps in the learning process implicit rather than visible.

    Instructional Materials: Knowledge as a Learning Journey

    Instructional materials are structured differently. Instead of focusing primarily on content coverage, they focus on the progression of understanding.

    Concepts are introduced in ways that anticipate how learners engage with them. Examples are used to illustrate how ideas are applied in practice. Short activities or reflective prompts allow students to test their understanding before moving to more complex tasks.

    The document therefore functions as a learning guide rather than only a content repository.

    In this structure, the relationship between learning outcomes, activities, and assessments becomes clearer. Students can see how each section of the material prepares them for the next stage of the course.

    Importantly, instructional structure does not reduce academic depth. Rather, it supports comprehension by helping students navigate complex ideas more gradually.

    Academic Expertise and Learning Design

    Another dimension of this discussion lies in how academics are trained. Most lecturers develop their expertise through years of disciplinary study and research. Their professional identity is shaped by deep engagement with a specific field of knowledge.

    As a result, their intellectual training emphasises depth. Scholars explore concepts thoroughly, analyse theories critically, and contribute new insights to their discipline. This depth of knowledge is essential to the university.

    However, the skills required to design learning materials are somewhat different. Instructional design asks a different set of questions. Instead of focusing primarily on what must be explained, it asks how learners will gradually come to understand and apply those ideas.

    It requires anticipating where students might struggle, sequencing concepts carefully, and providing examples or guided practice that support comprehension.

    These are not always areas in which academics receive formal preparation. Their training prepares them to generate knowledge and engage in scholarly debate rather than to design structured learning pathways.

    As a result, when lecturers develop course materials such as SIM, their instinct is often to present the subject in the most intellectually complete form possible. Concepts are explained in depth, and readings are selected to reflect the richness of the discipline.

    While this strengthens the academic substance of the material, the instructional pathway through that knowledge may remain implicit.

    A Gap Between Intention and Practice

    The role of the Malaysian Qualifications Agency (MQA) provides another lens through which to view this issue. As the governing body overseeing quality assurance in Malaysian higher education, MQA frameworks emphasise student-centred learning, constructive alignment, and learning environments that support independent study.

    These principles suggest that course materials should help guide students through the learning process, not merely present information.

    However, frameworks typically describe these expectations at a conceptual level rather than prescribing a fixed structure for SIM. Institutions are given flexibility in how they translate these principles into practice.

    Within this space of interpretation, an interesting pattern sometimes emerges.

    Universities often translate these principles into documentation processes. Templates are developed, sections are standardised, and materials are compiled to demonstrate that the required components exist. Learning outcomes are written, topic outlines are organised, and readings are listed.

    While these steps fulfil documentation requirements, they do not always translate the original intention of the framework into instructional design. The material may document the curriculum effectively while leaving the learning pathway implicit.

    In this sense, the framework emphasises learning support, while institutional practice sometimes emphasises content documentation.

    A Quiet Opportunity for Reflection

    Recognising this distinction between informational and instructional SIM opens an opportunity for reflection rather than criticism.

    Universities have long been centres of knowledge creation and transmission. Informational materials reflect that tradition. They preserve the intellectual depth and scholarly rigour that define academic disciplines.

    At the same time, contemporary learning environments increasingly require materials that help guide students through complex ideas more deliberately. As courses expand across digital and hybrid settings, the written material itself plays a larger role in shaping the learning experience.

    This does not diminish the role of lecturers. Rather, it invites closer alignment between teaching practices and the design of course materials.

    Lecturers continue to enrich and contextualise learning through discussion, explanation, and mentorship. Instructionally structured materials complement this work by making the learning pathway more visible between teaching sessions.

    Conclusion: From Documentation to Learning Architecture

    The distinction between informational and instructional SIM may appear subtle, but it reflects a deeper shift in how universities approach teaching and learning.

    Informational SIM documents what is taught; instructional SIM reveals how learning unfolds.

    Both have value. Academic depth remains essential to higher education. Yet as learning environments evolve, the structure of course materials increasingly shapes how students engage with knowledge.

    Seen in this light, the development of SIM is not merely an administrative exercise. It is part of the broader effort to design learning experiences that help students move from encountering ideas to understanding and applying them.

    The question therefore is not whether SIM exists, but how it functions.

    And within that quiet question lies an opportunity for universities to reflect on how knowledge is not only transmitted but also learned.

  • Why Most Digital Transformation Fails Before It Even Begins

    Why Most Digital Transformation Fails Before It Even Begins

    Digital transformation has become one of the most overused phrases in higher education strategy documents. Institutions proudly announce new learning management systems, AI-powered analytics dashboards, student engagement platforms, and digital reporting tools. Yet, beneath the surface of many of these initiatives lies a quiet truth: most digital transformation efforts fail before they even begin—not because the technology is inadequate, but because the underlying architecture, governance, and operational discipline are missing.

    Transformation is not the installation of a system. It is the re-engineering of how information flows, how decisions are made, and how accountability is structured. When institutions skip this foundational work, digital tools become cosmetic upgrades layered on top of structural fragility.

    Digital Tools Without Data Governance: Cosmetic Transformation

    In many universities, the first instinct is to “go digital” by procuring a new platform. The assumption is simple: if we modernise the tool, performance will improve. However, digital tools without data governance merely digitise existing chaos.

    Consider a scenario familiar to many higher education institutions. A university adopts a new course networking platform to enhance student engagement and track learning analytics. The platform offers dashboards, user labels, programme-level segmentation, and performance insights. Yet within weeks of implementation, inconsistencies begin to surface. Student identities do not match across systems. The same email address appears under multiple profiles. Graduating students are enrolled under outdated matriculation numbers. Programme labels are duplicated or misaligned.

    The issue is not the platform. The issue is that no one defined the rules governing data architecture before deployment.

    Data governance is not glamorous. It requires clarity on ownership, naming conventions, validation rules, escalation pathways, and system boundaries. Who owns the master student record? Which system is the source of truth? How are changes version-controlled? Without these answers, digital transformation becomes a patchwork of manual corrections and temporary fixes.

    In such contexts, transformation becomes cosmetic. Reports look sophisticated, but the underlying data cannot be trusted. Decision-makers spend more time questioning accuracy than acting on insights. The institution appears technologically advanced, yet operationally fragile.

    True digital transformation begins not with procurement, but with governance.

    Analytics Dashboards Are Useless Without Clean Architecture

    Higher education leadership increasingly demands dashboards. They want real-time enrolment trends, student engagement metrics, course completion rates, faculty workload analytics, and predictive risk indicators. Vendors promise visual clarity and AI-powered forecasting.

    However, analytics dashboards are only as reliable as the architecture feeding them.

    When data fields are inconsistently labelled, when programme codes differ across campuses, when user roles are not clearly defined, dashboards become misleading rather than empowering. A student marked as “graduating” in one dataset but “active” in another produces contradictory insights. A course offering list that merges archived and current codes inflates enrolment numbers. An email field reused across different students disrupts identity matching and engagement tracking.

    Architecture precedes analytics.

    Before visualisation, institutions must design a clean data schema:

    • Standardised programme codes across entities
    • Clear definitions of active vs. graduating status
    • Controlled user label taxonomy
    • Version-controlled course offering templates
    • Defined data refresh cycles
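
    As a minimal illustration of what this architectural discipline might look like in code, the sketch below validates incoming records against controlled vocabularies before they feed any dashboard. The programme codes, statuses, and field names are hypothetical; in practice they would come from the institution's own data dictionary.

    from dataclasses import dataclass

    # Hypothetical controlled vocabularies drawn from a data dictionary.
    VALID_PROGRAMME_CODES = {"BBA01", "BIT02", "MBA01"}
    VALID_STATUSES = {"active", "graduating", "completed", "withdrawn"}

    @dataclass
    class StudentRecord:
        student_id: str
        email: str
        programme_code: str
        status: str

    def validate(record: StudentRecord) -> list:
        """Return the schema violations for one record (empty list if clean)."""
        issues = []
        if record.programme_code not in VALID_PROGRAMME_CODES:
            issues.append(f"unknown programme code: {record.programme_code}")
        if record.status not in VALID_STATUSES:
            issues.append(f"undefined status: {record.status}")
        if "@" not in record.email:
            issues.append(f"malformed email: {record.email}")
        return issues

    # Records are checked before they enter any downstream reporting system.
    incoming = [
        StudentRecord("S1001", "amira@example.edu", "BBA01", "active"),
        StudentRecord("S1002", "lee@example.edu", "BBA-1", "graduated"),  # two violations
    ]
    for rec in incoming:
        for issue in validate(rec):
            print(f"{rec.student_id}: {issue}")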

    Without architectural discipline, dashboards create false confidence. Leaders may make strategic decisions based on incomplete or corrupted datasets. Faculty may lose trust in reporting outputs. Administrators may spend weeks reconciling discrepancies manually before every board presentation.

    In effect, the dashboard becomes theatre—visually compelling, strategically hollow.

    A university aspiring to become AI-ready cannot bypass this layer. Artificial intelligence does not solve messy architecture; it amplifies it. Poorly structured data produces poorly informed automation. If governance is weak, AI integration accelerates inconsistency rather than efficiency.

    The Hidden Cost of Manual Clean-Up

    One of the most underestimated costs of failed digital transformation is manual clean-up.

    When architecture is weak, human labour becomes the compensating mechanism. Staff cross-check graduating lists against master enrolment sheets. Administrators manually correct user labels. Learning designers verify student identities before course copy exercises. Teams reconcile reports line by line before submitting compliance documents.

    This hidden labour rarely appears in transformation budgets.

    It manifests instead as burnout, frustration, and lost productivity. Highly skilled staff—hired to innovate—are reduced to data janitors. Instead of focusing on instructional design enhancement or AI integration pilots, they spend hours resolving discrepancies that should never have existed.

    The opportunity cost is significant.

    Time spent correcting misaligned data labels is time not spent designing scalable digital workflows.
    Time spent reconciling reports is time not spent developing analytics-driven interventions for at-risk students.
    Time spent troubleshooting identity mismatches is time not spent strengthening curriculum coherence.

    Moreover, manual clean-up creates a false perception of stability. Because teams “manage to fix it,” leadership may not recognise systemic weaknesses. The organisation survives through invisible effort rather than structural soundness.

    Over time, this erodes trust. Staff begin to question whether transformation initiatives are strategic or reactive. Innovation fatigue sets in. Resistance to new systems grows—not because people dislike technology, but because they associate it with additional invisible labour.

    Transformation fails quietly when manual work compensates for architectural neglect.

    The Absence of Definition of Ready and Workflow Clarity

    Another recurring issue in higher education digital initiatives is the absence of a clear Definition of Ready (DoR). Projects are launched without clarity on prerequisites, dependencies, or workflow sequencing.

    For example, a university may initiate a large-scale course copy exercise to standardise online offerings across campuses. Yet if the course offering template has not been validated, if programme codes are inconsistent, if data labels remain unresolved, the copy process multiplies errors rather than resolves them.

    Without workflow clarity:

    • Teams operate in parallel with misaligned assumptions.
    • Data is entered into multiple systems simultaneously without reconciliation.
    • Escalations occur reactively rather than systematically.
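
    One hedged way to make a Definition of Ready explicit is to express it as a checklist that is evaluated before work such as the course copy exercise begins. The criteria below are illustrative assumptions, not a prescribed institutional standard.

    # A hypothetical Definition of Ready (DoR) gate for a course copy exercise.
    def definition_of_ready(course: dict) -> dict:
        """Evaluate each readiness criterion for one course record."""
        return {
            "template_validated": course.get("template_version") == "2025.1",
            "programme_code_standardised": course.get("programme_code", "").isupper(),
            "data_labels_resolved": not course.get("pending_label_issues", True),
            "owner_assigned": bool(course.get("owner")),
        }

    course = {
        "code": "MGT201",
        "template_version": "2024.3",   # outdated template: not ready
        "programme_code": "bba01",      # not standardised: not ready
        "pending_label_issues": False,
        "owner": "Learning Design Team",
    }

    checks = definition_of_ready(course)
    if all(checks.values()):
        print("Ready: proceed with the course copy.")
    else:
        unmet = [name for name, ok in checks.items() if not ok]
        print("Not ready. Unmet criteria:", ", ".join(unmet))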

    Digital transformation requires process mapping before platform deployment. Swimlane diagrams, role clarity matrices, and escalation thresholds are not bureaucratic obstacles—they are enablers of efficiency.

    When workflows are ambiguous, staff default to informal communication channels. Decisions are made in meetings but not documented. Data corrections occur without traceability. Over time, institutional memory fragments.

    A transformation agenda without operational clarity creates confusion masquerading as agility.

    What Universities Underestimate About EdTech Adoption

    Universities often underestimate three dimensions of EdTech adoption: behavioural change, operational maturity, and governance discipline.

    First, behavioural change. Technology adoption is not a technical shift; it is a cultural one. Faculty members must trust that systems are reliable. Administrators must believe that data definitions are consistent. Leaders must model evidence-based decision-making rather than anecdotal preference. Without behavioural alignment, even well-designed systems remain underutilised.

    Second, operational maturity. Institutions with fragmented processes struggle to integrate digital tools coherently. If campus entities maintain independent templates, separate naming conventions, and informal reporting practices, cross-entity standardisation becomes complex. EdTech adoption requires alignment across academic affairs, registry, IT, and quality assurance functions.

    Third, governance discipline. Transformation requires sustained oversight. Data stewardship roles must be defined. Regular audits must be institutionalised. Architecture reviews must precede feature expansions. Governance is not a one-time exercise; it is an ongoing commitment.

    Many institutions treat EdTech as an add-on rather than a core operational layer. Yet in a digitally mediated learning environment, data architecture is infrastructure. It is as critical as physical classrooms once were.

    From Cosmetic to Structural Transformation

    An AI-ready ecosystem in higher education demands structural transformation. This means:

    1. Establishing a single source of truth for student identity and programme classification.
    2. Designing controlled taxonomies for user labels and course statuses.
    3. Embedding validation checkpoints before data enters downstream systems.
    4. Documenting workflows with explicit Definition of Ready criteria.
    5. Institutionalising periodic architecture audits prior to analytics expansion.
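
    To make the first of these measures concrete, the sketch below treats a hypothetical registry export as the single source of truth and flags where a downstream platform disagrees on student status or attaches one email address to more than one profile. The data, field names, and system roles are assumptions for illustration.

    from collections import defaultdict

    # Hypothetical exports: the registry is the source of truth,
    # the learning platform is the downstream system to reconcile.
    registry = {
        "S1001": {"email": "amira@example.edu", "status": "graduating"},
        "S1002": {"email": "lee@example.edu", "status": "active"},
    }
    platform = {
        "S1001": {"email": "amira@example.edu", "status": "active"},  # status conflict
        "S1002": {"email": "amira@example.edu", "status": "active"},  # email reused
    }

    # 1. Status conflicts: the downstream system disagrees with the source of truth.
    for sid, reg in registry.items():
        plat = platform.get(sid)
        if plat and plat["status"] != reg["status"]:
            print(f"{sid}: '{plat['status']}' on platform vs '{reg['status']}' in registry")

    # 2. Identity conflicts: one email address attached to more than one student.
    by_email = defaultdict(list)
    for sid, plat in platform.items():
        by_email[plat["email"]].append(sid)
    for email, sids in by_email.items():
        if len(sids) > 1:
            print(f"{email} is shared by {sids}")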

    Only when governance precedes tools can digital initiatives produce sustainable impact.

    The goal is not to accumulate platforms. It is to create coherence.

    When data flows cleanly, dashboards become meaningful. When architecture is stable, AI becomes trustworthy. When workflows are documented, scale becomes possible.  

    Transformation does not fail because universities lack ambition. It fails because they underestimate the foundational discipline required before implementation.

    Digital maturity is less about innovation theatre and more about operational integrity.

    The institutions that succeed will be those that recognise this early: transformation begins long before the first dashboard goes live. It begins in the invisible architecture beneath it.

  • Transformation Is Not About Speed. It Is About Execution Discipline

    Transformation Is Not About Speed. It Is About Execution Discipline

    Transformation is often described in the language of acceleration. Institutions speak about moving quickly, digitising rapidly, scaling efficiently, and staying ahead. In higher education especially, speed signals relevance. A new learning platform, a redesigned dashboard, or an AI-enabled feature creates visible evidence that progress is happening.

    But in complex institutional environments, transformation is rarely a speed problem. It is an execution discipline problem.

    Speed feels productive because it is visible. It creates momentum. It reassures stakeholders. Yet speed applied to unstable structures does not create transformation. It amplifies misalignment. It distributes weaknesses across a larger system. What appears efficient in the short term can create strain that surfaces later—during audits, accreditation reviews, reporting cycles, or leadership transitions.

    Visible change is only the surface layer. Beneath every digital platform or new initiative sits invisible architecture: data definitions, governance rules, workflow dependencies, ownership clarity, documentation standards, and compliance alignment. If that architecture is weak, speed accelerates fragility.

    Transformation is not proven at launch. It is proven under pressure.

    Activity Versus Execution

    One of the most common misunderstandings in organisational change is confusing activity with execution.

    Activity is easy to observe. Meetings are conducted. Templates are distributed. Systems go live. Reports are produced. Workshops are held. These actions create movement.

    Execution discipline is different. It requires clarity before movement. It asks: What does “ready” mean before development begins? Who owns each stage of the workflow? How is version control maintained? Where are quality checkpoints embedded? Are definitions consistent across departments? How does this align with regulatory expectations?

    Execution discipline is quieter. It may slow visible momentum at the beginning. But it strengthens coherence across the system.

    Without discipline, small inconsistencies accumulate. A misaligned data label seems minor until it affects reporting accuracy. An undefined moderation process appears manageable until grade disputes increase. An undocumented workflow functions adequately until a key staff member leaves.

    Execution discipline pays attention to these small fractures before they widen.

    Systems Thinking and Interdependence

    Institutions are not linear machines. They are interconnected systems. Decisions in one area influence outcomes in another.

    In higher education, for example, a change in course development processes may affect accreditation documentation, digital platform configuration, student reporting dashboards, faculty workload planning, and quality assurance reviews. None of these operate in isolation.

    When transformation focuses only on speed, it often treats systems as separate units. But when alignment is weak, acceleration spreads misalignment across multiple functions.

    A course may be uploaded quickly into a digital platform. Students may access materials without issue. However, if the course structure does not align with approved programme documentation, or if assessment weightings vary inconsistently across faculties, institutional risk increases quietly. During formal review cycles, those inconsistencies surface.

    Execution discipline recognises interdependence. It pauses to ask how each decision fits within the larger institutional structure. It prioritises coherence over immediacy.

    The Pressure to Appear Modern

    Institutions do not operate in isolation. They respond to competitive pressure, regulatory expectations, and peer comparisons. When other universities adopt new technologies or transformation narratives, the pressure to follow intensifies.

    Visible digital transformation becomes part of institutional identity. Speed becomes a symbol of innovation.

    Yet when transformation is driven primarily by optics, structure can be overlooked. A new platform may be implemented quickly to signal advancement. But if governance layers, data alignment, and workflow clarity are not embedded, operational strain emerges later.

    This strain appears as manual reconciliation before reporting deadlines, inconsistent data across campuses, unclear ownership of processes, and repeated rework each semester. These are not simply operational inefficiencies. They are symptoms of insufficient execution discipline.

    True transformation is not about appearing modern. It is about becoming structurally mature.

    The Middle Layer and Risk Containment

    In many institutions, execution discipline sits within the middle layer of leadership. Senior leaders set direction. Operational teams deliver tasks. Middle leaders translate ambition into structured practice.

    When this layer insists on standardising templates before scaling, aligning digital systems with approved academic frameworks, documenting workflows before automation, or clarifying accountability before delegation, the pacing may appear cautious.

    Yet this role functions as institutional risk containment.

    Without execution discipline at this level, transformation becomes dependent on individual effort rather than systemic stability. Processes rely on memory instead of documentation. Clarifications must be repeated each cycle. Operational continuity becomes vulnerable to staff turnover.

    Execution discipline reduces dependency on heroics. It replaces personal intervention with institutional structure.

    Governance as Infrastructure

    Governance is often misunderstood as unnecessary complexity. In reality, governance functions as infrastructure. It clarifies standards, defines accountability, and ensures consistency across time and scale.

    Without governance, organisations rely on informal understanding. With governance, they rely on shared and documented expectations.

    Sustainability is not a strategic slogan. It is the result of disciplined governance practices. When data definitions are standardised, workflows are documented, and approval processes are structured, institutions become less reactive. Accreditation reviews become procedural rather than stressful. Reporting becomes reliable rather than interpretative.

    Structure reduces anxiety because expectations are clear. When roles are defined and escalation paths documented, teams spend less time negotiating and more time executing.

    Discipline Enables Agility

    There is a common belief that discipline slows innovation. In practice, discipline enables agility.

    When systems are structured, decisions move faster. When ownership is explicit, accountability is immediate. When data can be trusted, analysis becomes meaningful rather than speculative.

    Agility without discipline is improvisation. Agility with discipline is controlled acceleration.

    Once execution discipline is embedded, speed becomes a natural outcome. Teams are not renegotiating expectations each time a new initiative begins. They are building upon established frameworks.

    Clarity reduces rework. Alignment reduces confusion. Documentation reduces dependency.

    Speed then emerges from structure.

    Redefining “Slow”

    The label “slow” often reflects discomfort with invisible work. Aligning naming conventions, refining data dictionaries, mapping digital systems to academic structures, and embedding quality checkpoints do not produce visible excitement.

    Yet these tasks determine whether transformation holds under pressure.

    The more strategic question is not how quickly something was implemented. It is whether it will withstand complexity. Will it remain coherent during leadership transitions? Will it scale across campuses without structural renegotiation? Will it survive regulatory scrutiny?

    Correction is always more expensive than prevention. Disciplined sequencing may extend initial timelines slightly, but it dramatically reduces long-term correction cycles.

    Execution discipline is not delay. It is durability.

    Structure Before Velocity

    Transformation should not be measured by how rapidly outputs are produced. It should be evaluated by how reliably systems function over time.

    Structural maturity includes aligned data architecture, embedded governance layers, documented workflows, and reduced reliance on individual intervention. It reflects a shift from reactive problem-solving to intentional system design.

    In higher education, where compliance, accreditation, and public accountability intersect, resilience is essential. Speed achieved without structure produces fragility. Structure embedded through disciplined execution produces stability. Stability enables scalable speed.

    Transformation is not about moving quickly enough to appear progressive. It is about building systems intentionally enough to endure.

    Execution discipline may not attract attention. It may even be misunderstood. Yet it is the foundation upon which sustainable transformation rests.

    In the long run, disciplined execution is not slower.

    It is simply stronger.

  • Education Technology: Yesterday, Today and Tomorrow

    Education Technology: Yesterday, Today and Tomorrow

    My journey with education technology did not begin with platforms, systems architecture, or artificial intelligence. It began in the classroom — with lesson plans, learners’ questions, moments of confusion, and moments of clarity. Like many educators, I entered the profession believing deeply in education as a space for growth, even if I did not yet have the language to articulate what that growth truly required.

    Over time, my career evolved from teaching to curriculum and instructional design, then into quality systems, followed by learning operations, and most recently into education technology.

    Looking back, these transitions were not random shifts, but layered progressions. Each role added a new way of seeing learning — pedagogy, structure, systems, and scale.

    Through this journey, my understanding of education technology has changed profoundly. What once felt like supportive tools now feels like an ecosystem that shapes behaviour, access, equity, and possibility. Reflecting on education technology across yesterday, today, and tomorrow is therefore not theoretical for me — it is deeply lived.

    Yesterday: Education Technology as Support and Substitution

    In the early stage of my career as a lecturer, education technology was largely about supporting instruction, not redefining it. PowerPoint replaced handwritten notes. Videos supplemented explanations. Online platforms became repositories for materials and assignments. Technology was seen as an enhancement — something that made lessons clearer, more engaging, or more efficient.

    Yet the classroom structure remained largely unchanged. The teacher was still the authority. The curriculum was fixed. Learners progressed at the same pace, regardless of individual differences.

    I used technology enthusiastically, but uncritically. If a tool worked and helped me deliver content better, it was considered effective. The dominant question then was, “How can I teach this better with technology?” rather than, “How might learning itself change because of technology?”

    Looking back, this phase reflects how education systems historically approached EdTech: as a digital substitute for existing practices. Worksheets, lectures, and assessments were digitised — but pedagogy remained intact. Learning was still measured by completion and recall rather than application or mastery.

    This was not a failure of technology. It was a reflection of how narrowly we understood learning at the time.

    From Teaching to Design: When Technology Exposed the Gaps

    My transition into curriculum development and instructional design marked the first major shift in how I perceived education technology. Designing learning at scale forced me to confront uncomfortable truths. Content alone was not enough. Well-written materials did not guarantee engagement. Carefully planned outcomes did not always translate into consistent delivery or learner competence.

    This is where education technology became more than a delivery mechanism — it became a mirror.

    Learning platforms revealed drop-off points. Assessment systems highlighted misalignment between outcomes and evaluation. Analytics surfaced patterns of struggle that individual classrooms often concealed.

    As an instructional designer, I began to see technology as an enabler of intentionality. It allowed us to:

    • Align learning outcomes, activities, and assessments more rigorously
    • Design learner journeys rather than isolated sessions
    • Test and iterate learning experiences
    • Capture data that informed continuous improvement

    The focus shifted from teaching content to designing learning experiences. Technology supported this shift not by being innovative, but by being structured, traceable, and scalable.

    Yet even at this stage, EdTech remained largely curriculum-centric. The learner experience improved, but systems were still often designed around programmes rather than people.

    Today: Education Technology as an Operational and Strategic System

    My move into quality assurance and then into learning operations fundamentally changed how I see education technology today.

    At the operational level, EdTech is no longer optional. It is the infrastructure that holds modern education together. Learning management systems, assessment platforms, analytics dashboards, content repositories, and collaboration tools now form interconnected ecosystems that determine how learning is governed, delivered, monitored, and improved.

    From this vantage point, technology is inseparable from:

    • Scalability
    • Compliance
    • Quality assurance
    • Standardisation
    • Risk management

    A well-designed curriculum can fail without the right systems to support it. A strong faculty can struggle without operational clarity. Education technology amplifies both excellence and weakness.

    One of the most significant changes in today’s EdTech landscape is the prominence of data. Learning is no longer invisible. Participation, progression, performance, and engagement can be tracked and analysed. Decisions can be evidence-based rather than anecdotal.

    However, this also introduces tension.

    As someone responsible for learning operations, I have seen how easily data can be misunderstood or misused. Dashboards can prioritise activity over learning. Metrics can create pressure rather than insight. Technology can slip from enabler to enforcer.

    Today’s challenge, therefore, is not access to technology — but governance, capability, and intent.

    Stepping into EdTech Leadership

    Starting my role as an EdTech Manager feels like standing at the intersection of pedagogy, technology, and strategy. This role has sharpened my belief that education technology should not lead education — it should serve it.

    Platforms must align with institutional philosophy. Tools must support educators, not overwhelm them. Innovation must be purposeful, not performative.

    At this stage of my career, I no longer ask whether a tool is advanced. I ask:

    • Does it improve learner experience meaningfully?
    • Does it support educators’ professional practice?
    • Does it enable quality, equity, and sustainability?
    • Does it align with long-term educational goals?

    EdTech leadership today requires more than technical fluency. It demands systems thinking, change management, ethical judgment, and deep respect for the human dimension of learning.

    Tomorrow: Education Technology as Intelligent, Invisible, and Human-Centred

    When I think about the future of education technology, I do not imagine more platforms or features. I imagine simpler, smarter, more human-centred ecosystems.

    Tomorrow’s EdTech will likely be:

    • Adaptive, responding to individual learner needs in real time
    • Embedded, seamlessly integrated into learning and work
    • Skills-focused, emphasising demonstrated capability over seat time
    • AI-supported, but human-governed

    Artificial intelligence will increasingly support curriculum design, assessment, learner support, and analytics. AI tutors, feedback engines, and learning companions will reduce administrative load and allow educators to focus on higher-value interactions.

    However, the most important shift will not be technological — it will be philosophical.

    Education will move from:

    • Standardised pathways to personalised journeys
    • Time-based progression to mastery-based learning
    • Static qualifications to stackable, lifelong credentials

    In this future, the role of institutions and EdTech leaders will be to ensure that technology amplifies human potential rather than replaces human purpose.

    Ethics, accessibility, data privacy, and inclusivity will no longer be secondary considerations. They will be central to EdTech design and governance.

    My Personal Commitment Moving Forward

    Reflecting on my journey — from teaching to designing, from operations to EdTech leadership — I realise that each stage has shaped how I understand the responsibility that comes with education technology.

    • Yesterday taught me the importance of clarity and engagement.
    • Today taught me the power of systems and data.
    • Tomorrow demands that I act as a steward of learning ecosystems.

    As EdTech professionals and leaders, we are not merely implementing tools. We are shaping how people learn, adapt, and access opportunity in an increasingly complex world.

    The future of education technology will not be defined by how advanced our systems are, but by how thoughtfully we design them — with learners, educators, and society in mind.

    Conclusion

    • Education technology yesterday supported teaching.
    • Education technology today enables systems.
    • Education technology tomorrow must serve humanity.

    Standing now in a new role, I am convinced of one thing:

    “Technology will continue to evolve, but education must remain deeply human.”

    Our task is not to chase innovation, but to design learning environments where technology quietly, ethically, and intelligently empowers people to grow.

    And that, truly, is what I aspire to bring to the table.

  • The Learning Continuum

    The Learning Continuum

    I didn’t start my career intending to become someone who thinks deeply about learning systems.

    I started as a lecturer — teaching students who came to class expecting knowledge, structure, and clarity. Later, I moved into a professional body, designing qualifications and certifications. Then I entered Learning & Development (L&D) in a corporate environment, where learning was supposed to translate directly into performance. Along the way, I also had a front-row seat to the realities of workplace training through my husband, a certified trainer — witnessing the pressure, expectations, and invisible work behind every “successful” training programme.

    Across these roles, one thing became increasingly clear to me:

    “We keep using the word learning — but we mean very different things.”

    And when we confuse teaching, training, and training for professional certification, people don’t just feel bored. They feel exhausted, disengaged, and quietly resentful of learning that doesn’t help them do their jobs better.

    Teaching: Building Understanding That Transfers

    Teaching is where my journey began. In teaching, the goal is understanding. Knowledge must stand independently of context. Students are expected to grasp concepts deeply enough to explain them, critique them, and apply them across scenarios that may not yet exist. A good teaching question sounds like:

    “Do you understand this concept in principle?”

    Time is generous. Exploration is encouraged. Assessment measures comprehension, reasoning, and intellectual clarity. Teaching is not rushed, because understanding cannot be rushed. And importantly:

    “Teaching does not promise immediate performance. It promises cognitive readiness.”

    A student who understands a concept doesn’t just repeat what they’ve been taught. They can adapt it when the environment changes, when the tools change, when the problem looks different.

    That, I realised early on, is the essence of teaching: building understanding that lasts.

    Training for Professional Certification: Understanding Meets Competence

    Years later, I joined a professional body, where I began designing qualifications. What I came to see about “professional certification” is that it is structured, controlled, and high-stakes. I had to consider:

    • What must every competent professional know?
    • What can reasonably be self-studied?
    • What requires formal instruction?
    • How do we assess competence fairly?

    On paper, the learning pathways were labelled “self-study”; in practice, flexibility existed — formal structured learning was optional but always available. Still, the syllabus was tightly controlled. The right to train was granted only to accredited trainers and approved training agencies.

    Why?

    “Because professional certification is not just about learning. It is about standards, trust, and accountability.  Being certified carries a promise that a professional has met a defined, agreed-upon standard of competence.”

    This promise cannot rely solely on informal learning or on-the-job experience. It requires structure, formal assessment, and clear boundaries.

    Some experienced professionals push back – they say:

    “I’ve been doing this for years. Why do I need certification? My work speaks for itself.”

    In many ways, it does. But professional certification was never designed to judge individual brilliance. Professional certifications exist because systems need shared, portable standards. From a design perspective, experience is not ignored — it is refined. The modules are curated meticulously by industry experts, debated across committees, and stress-tested against current practice and future demands.

    “Professional certification takes what practitioners already know and aligns it with agreed industry standards, ethical boundaries, and future expectations. Without it, experience remains personal, but with it, competence becomes accountable.”

    It does not replace experience.

    It sharpens it.

    Most importantly, it doesn’t end once you pass the exam.

    To maintain credibility, certified professionals are obliged to undertake a defined number of learning hours through structured or unstructured learning activities. This is known as Continuous Professional Development (CPD). CPD ensures practices stay relevant, knowledge stays current, and the promise behind the professional certification continues to be meaningful. The professional certification is the baseline, and CPD is the ongoing commitment to remain credible and capable.

    Training: Seeing the Real Work of a Corporate Trainer

    Through my husband, I gained another perspective on learning — the world of corporate training. He started as an independent trainer, delivering workshops to diverse clients. Later, he joined an organisation as their in-house corporate trainer, responsible for designing, delivering, and measuring learning outcomes across multiple teams.

    Through him, I saw the real work behind the title “trainer” – it’s not just standing in front of a room and talking. It’s pressure, preparation, and precision:

    • Pressure from expectations: Learners come with different backgrounds, skills, attitudes and motivations. Some are there to learn, to improve themselves; others attend only because it’s mandatory. Trainers must meet everyone’s needs while keeping the session relevant and engaging.
    • Pressure from outcomes: Organisations want results. Bosses want their people to have the skill to get the job done. Training isn’t just “sharing knowledge” — it is about improving performance, fixing gaps, applying skills, and changing behaviour. Trainers are accountable for these outcomes, often under tight timelines.
    • Aspiration and craft: Good trainers aspire to more than delivery. They craft content carefully, anticipate challenges, design exercises that resonate, and measure transfer of learning. They balance engagement, relevance, and rigor, all in real time.

    Watching him, I realised that being a corporate trainer is part pedagogue, part psychologist, part project manager. You need empathy to understand your learners, influence to manage their bosses, strategy to design meaningful programmes, and stamina to deliver consistently under scrutiny.

    It also made me appreciate why some workplace learning succeeds, and some fails. A trainer’s skill can be brilliant, but if the system, expectations, or support is misaligned, even the best facilitator cannot make learning stick.

    One thing he often emphasises is:

    “Training is meant to improve performance, not dwell on theory. Yet, there is no such thing as ideal. We can’t run away from having to face some content that is inherently theory-heavy — complex systems, workflows, or technical tools. The challenge is structuring and delivering it, so learners remain engaged, connected to outcomes, and able to apply knowledge.”

    I saw this in action when he designed a system training module for his organisation. The module had a theory-heavy prerequisite delivered via e-learning, covering concepts staff had to understand before touching the system. Instead of letting it remain a dry, abstract experience, he implemented a hybrid approach:

    • Learners completed the e-learning module at their own pace, ensuring baseline knowledge. This became the prerequisite for the classroom session.
    • Classroom sessions were hands-on, scenario-driven, and performance-focused, where learners applied concepts directly to tasks they would perform at work.
    • Exercises simulated real work conditions, allowing learners to practice, ask questions, and build confidence before independent application.

    This approach struck a balance – learners were prepared and knowledgeable, yet the training remained practical, relevant, and performance-oriented.

    Watching him, I realised that:

    “A good training design is both art and science. It cannot always be a one-size-fits-all approach. You have to respect theory when necessary but always keep one eye on the end goal – the learners’ competence and capability in the real world.”

    This brought me to a principle I now hold dear – training at work should always have a purpose. Even when you attend sessions on communication, personal grooming, or presentation skills, the goal is not just self-improvement — it’s about being better at your role, projecting credibility, and performing effectively.

    Training should be intentional, focused on reskilling or upskilling, and delivered based on actual need, not “just because there’s a course available.”

    When the purpose of training is clear, relevant and measurable in the work, people:

    • Engage meaningfully
    • Apply skills immediately
    • Retain knowledge
    • Take ownership of their development

    Without purpose, training risks becoming checkbox learning — attendance driven by perks, not by progress.

    Seeing the Differences Clearly

    Across my experiences, I’ve learned to articulate the differences between teaching, performance training, and training for professional certification clearly:

    Aspect            | Teaching               | Performance Training        | Training for Professional Certification
    Primary Goal      | Understanding          | Performance                 | Verified competence
    Structure         | High                   | Flexible                    | Very high
    Learning Context  | Classroom / structured | Workplace (70–20–10 model)  | Structured + guided practice
    Assessment        | Knowledge & reasoning  | Task performance            | Standard-based, high-stakes
    Transferability   | High                   | Context-specific            | Medium–high (within profession)

    Each has its place, each has its limits, and problems arise when we ignore these boundaries.

    Standing in Between: What I’ve Learned

    I spent my career moving through these worlds – teaching, qualification design, and performance training.

    I wasn’t “just a lecturer”.

    I wasn’t “just L&D”.

    I wasn’t “just a qualification designer”.

    I was someone who stood between theory and practice, recognising where learning is about understanding, where it is about doing, and where it is about proving competence.

    That perspective allows me to see:

    • When teaching is essential
    • When training is enough
    • When certification must hold the line

    And if I reflect on all these experiences, a simple truth emerges:

    • Teaching should focus on understanding and transferability
    • Performance training should focus on immediate competence in context
    • Professional certification training should ensure understanding plus evidence of competence

    The gap between how learning should work and how it actually works is obvious — and that gap is where learning leaders must act.

    Not rush training for compliance.

    Not compress learning for speed.

    Not overload classrooms with “just in case” theory.

    Learning should not exhaust people. It should enable them.

    And when each form of learning is designed — and respected — for what it truly is, learning doesn’t just happen.

    It works.

  • Creating Learning that Works

    Creating Learning that Works

    In my experience working at the intersection of curriculum and learning & development frameworks, both processes felt familiar. Each follows a structured cycle of analysis, design, implementation, and evaluation — the logical sequence any learning professional would recognise. What differentiates them, however, are their intent and impact.

    A curriculum framework operates at the micro level — it is educational, instructional, and learner-centred. It defines learning experiences, content, and assessments to develop specific capabilities, translating strategic intent into tangible, measurable learning journeys that progress logically over time. An L&D framework, on the other hand, operates at the macro level — it is organisational, strategic, and systemic. It identifies the capabilities the organisation needs to thrive, whether in leadership, communication, or digital literacy, and ensures that learning aligns with business priorities, culture, and performance outcomes. In short, it is the architecture of learning at scale, designed to shape the workforce for what’s next.

    In a corporate ecosystem, the L&D framework sets the direction, while curriculum frameworks bring that direction to life. And at the heart of it all — the real audience is the employees themselves.

    Recently, I found myself asking a question that felt both simple and profound:

    • How do employees in Malaysia really feel about their organisation’s learning initiatives?
    • Do they see them as a genuine pathway to growth — or just another HR process that looks good on paper?

    To explore this, I decided to collect some data myself. Not as part of a formal research project, but out of genuine curiosity. I wanted to understand, from employees’ perspectives, how learning opportunities are communicated, accessed, rewarded, and supported in real workplaces today.

    My early findings mirrored what I had long suspected: employees want to learn. They are eager to grow, stay relevant, and contribute meaningfully. But structural and cultural barriers persist — and they have a real impact on how learning happens at work.

    Research supports this too. Findings from the LinkedIn Workplace Learning Report (2024) echo what many of us already know:

    • Workload remains the biggest obstacle to learning.
    • Communication on upskilling and reskilling opportunities is often inconsistent.
    • Recognition is unclear — only a fraction of organisations link learning milestones to rewards, promotions, or visibility.

    The result? Employees navigate a maze of opportunities without a map. They see programmes, but not pathways. Initiatives, but not impact. And when learning feels disconnected from growth, participation becomes compliance, not commitment.

    As both a designer of learning systems and a participant within them, here is what I’ve learned about what makes learning work:

    ✅ 1. Communication Builds Clarity

    Employees can’t align with what they don’t understand. Learning & Development strategies must be communicated with the same intensity and clarity as business goals — repeatedly, transparently, and in ways that connect to personal growth. Without clarity, even the best-designed L&D strategy risks becoming noise.

    ✅ 2. Structure Builds Trust

    When people know how to enrol, what criteria apply, and what outcomes to expect, they engage with ownership. Ambiguity erodes trust; structure builds it. A good L&D framework provides a roadmap — making learning accessible, predictable, and equitable.

    ✅ 3. Recognition Sustains Motivation

    Motivation doesn’t always need to be monetary. Recognition can come in many forms — certifications, project leadership opportunities, internal visibility, or acknowledgment from managers. What matters is fairness and visibility. Without it, engagement fades and learning becomes transactional.

    ✅ 4. Balance Sustains Performance

    As the LinkedIn Workplace Learning Report (2024) highlights, nearly half of global employees (49%) say workload prevents them from pursuing learning. Organisations that provide protected learning time or adjust workloads consistently see higher engagement and retention. Real commitment means making space for growth, not just preaching it.

    Another point worth mentioning about why learning initiatives often struggle is the way organisations separate talent strategy from learning strategy. Talent strategy focuses on identifying, developing, and retaining high-potential employees for current and future roles. Learning strategy focuses on designing and delivering programmes that build skills and capabilities. When these functions operate in silos — sometimes even with different owners and KPIs — learning and talent can feel like competing priorities. Learning teams may focus on course completion metrics, while talent teams focus on succession or retention goals. The result? Initiatives are disconnected, employees are confused, and the organisation misses the opportunity to develop people holistically.

    Bridging this gap requires integration, alignment, and shared ownership. Learning becomes a tool to grow talent, and talent strategy becomes a lens through which learning programmes are designed.

    Another observation worth adding concerns current trends in Malaysia’s learning landscape – Malaysia’s economic growth is being driven by digital transformation, manufacturing innovation, and a renewed services sector. This shift is redefining what L&D frameworks must deliver. Three trends have come into the spotlight:

    ✅ 1. AI and Automation Are Rewriting the Skills Playbook

    Core skills now include AI literacy, digital adaptability, and continuous learning. L&D frameworks must evolve from static competency models to dynamic, continuously updated capability ecosystems.

    ✅ 2. Employees Want Learning to Be Meaningful, Not Mandated

    Learning requires space, focus, and support. It’s not about the number of courses completed, but the depth of growth achieved.

    ✅ 3. Environment Shapes Motivation

    Culture matters. When peers and leaders value growth, curiosity spreads naturally. When learning is treated as an afterthought, enthusiasm fades — no matter how good the content is.

    Despite initiatives, 77% of APAC employers report difficulty filling roles, especially in data and tech (ManpowerGroup, 2025). This isn’t just a hiring issue — it’s a development issue. Organisations can’t recruit their way out of a skills gap; they must develop their way out.

    The way I see it, it’s time we create the human-centred future of learning. The foundation is already there: a workforce eager to learn. What’s needed now is alignment, clear pathways, and leadership that sees learning not as an interruption, but as an investment. Learning should mirror growth — fluid, flexible, and human. It should encourage curiosity, not compliance; reflection, not repetition. Successful L&D frameworks balance structure with empathy, celebrate milestones while focusing on meaning, and connect learning not just to performance, but to purpose. When learning becomes part of the organisational DNA — woven into conversations, performance reviews, and leadership decisions — culture changes. Employees begin to see growth not as an expectation, but as a shared journey.

    Looking back, my journey from curriculum design to L&D strategy reminds me: learning is never static — it evolves as people do. The curriculum framework taught me precision — to think about sequences, outcomes, and assessments. The L&D framework taught me vision — to connect learning with culture, systems, and strategy. Both are necessary. One provides the how; the other ensures the why. And more importantly, both remind me that learning, at its core, is relational. It’s about people — their stories, aspirations, and the systems that either support or stifle their growth.

    My hope is simple: That one day, every employee can proudly say —

    “My company’s learning culture doesn’t just demand growth from me; it grows with me.”

  • Building Structure from Legacy Knowledge

    Building Structure from Legacy Knowledge

    There’s a peculiar kind of silence that lingers in organisations that run purely on legacy knowledge. It’s not the silence of inactivity—it’s the silence of familiarity. Everyone knows what to do, but no one quite knows how they know it. Things just work—until they don’t.

    When I first walked into such an environment, the company had been operating for decades without structured documentation. Processes were embedded in people, not paper. The ones who “knew how things worked” had been there for years, sometimes decades. What passed for SOPs, process maps, and reference manuals were habits, muscle memory, and the occasional Excel sheet that everyone swore was “the latest version.”

    Gulp!

    As someone whose professional grounding has always revolved around building structure, I have always believed that institutional excellence begins with clarity. But clarity doesn’t appear on its own—it needs to be designed, validated, and communicated. My mission was to turn tacit knowledge into explicit systems: to write the SOPs that never existed.

    How did I do that?

    Step 1 – Understanding the Unwritten

    Before writing a single word, I needed to understand the unwritten rules. I learned quickly that legacy organisations don’t resist documentation because they’re careless—they resist it because it threatens the familiarity that makes their work feel stable.

    So instead of starting with templates, I started with people. I spent time shadowing teams—sitting beside coordinators, analysts, and trainers; listening to how they explained things to new hires; and noting down what they did differently from what others claimed to do. These were not formal interviews but conversations of trust.

    When I asked, “How do you usually do this?”, the answer often began with, “Normally we…” followed by a mix of “but sometimes…” and “it depends.”

    That was my first insight — the SOPs didn’t just need to document the process; they needed to capture the logic behind the decisions.

    Step 2 – Mapping the Chaos

    Legacy processes are rarely linear. They grow organically—layer by layer, patch by patch. To make sense of them, I used a bottom-up process mapping approach.

    Rather than starting with a departmental workflow, I identified process clusters—recurring patterns of activities that led to specific outcomes. For instance, in a learning operations environment, “course activation” could involve HR, content, logistics, trainers, and even finance. Everyone had a hand in it, but no one owned the end-to-end picture. Using visual process maps (I prefer Lucidchart or Miro), I traced each step as it was currently done, not as it should be done. That distinction was crucial.  Documenting “ideal processes” too early can alienate those who actually run them. By mapping the reality first, I earned credibility, and most importantly, people saw that I wasn’t here to “change everything,” but to understand.

    Once the as-is map was complete, I would gather the stakeholders in short, focused sessions to validate it. This often led to surprising discoveries: redundant approval loops, outdated forms, or responsibilities that had shifted over time without anyone realising it.

    Step 3 – Defining Ownership and Accountability

    In companies that run on legacy knowledge, ownership is fluid.

    “We’ve always done it this way” often translates to “we’re not sure who’s in charge.”

    So, the next step was to clarify process ownership. Every SOP, no matter how simple, needed to answer one key question – Who owns this process from start to end? This was not a matter of hierarchy—it was a matter of clarity. For example, if a process involved four departments, one had to be the process owner (responsible for ensuring compliance and improvement), while others were contributors.

    Introducing this concept wasn’t easy. It required diplomacy and empathy. In some cases, it meant revisiting turf boundaries. But over time, people began to see the value—when ownership was clear, so were expectations.

    Step 4 – Writing for the Reader

    Once the groundwork was set, it was time to write.

    I believe that SOPs are not just documents—they are learning tools. They need to be simple enough for a new hire to follow yet detailed enough for an auditor to verify. So, I structured each SOP with three key sections:

    • Purpose and Scope – Why the SOP exists and what it covers.
    • Roles and Responsibilities – Who does what, clearly stated.
    • Step-by-Step Procedure – The actual process, written in a concise, active voice, supported by visuals or decision trees when needed.

    For each procedure, I included “critical control points” — steps that could impact compliance, customer experience, or data accuracy. These help transform the SOP from a static manual into a dynamic quality tool.

    Step 5 – Validation and Continuous Feedback

    No SOP should ever be written in isolation. Once drafts were ready, I organised validation walkthroughs — live sessions where process owners performed the tasks using the draft SOP as their guide.

    This was often where the magic happened. Watching someone struggle to find a step or interpret an instruction revealed exactly where the SOP failed to communicate. One of the most important lessons I learned was this:

    “The effectiveness of an SOP is not in how it reads, but in how it guides”.

    Through each round of feedback, the documents evolved not only in accuracy but also in voice, gradually beginning to sound like the people who use them rather than like corporate templates written by outsiders.

    Step 6 – Implementation and Change Management

    Documentation is the easy part. Adoption is the real challenge. I quickly learned that rolling out SOPs is a change management exercise, not just a documentation project. Legacy organisations have muscle memory, and changing that requires more than uploading a PDF to the shared drive.

    To encourage adoption, I used three key strategies:

    • Microlearning orientation: Short sessions introducing each SOP and why it mattered.
    • Champion system: Appointing process champions within departments to answer questions and reinforce consistency.
    • Feedback loop: Creating a formal channel (e.g., monthly review form or MS Teams chat) for users to flag inconsistencies or propose improvements.

    Over time, what started as a documentation initiative evolved into a culture of accountability. Teams began using SOPs as a reference, not as a chore. The technical steps were one thing, but the human side was the real learning curve.

    I remember one senior staff member telling me, “I’ve been doing this for 15 years. I don’t need an SOP to tell me how.” She was right, in a way. She didn’t need it—but the organisation did.

    Even my own boss once said, “SOPs are a waste of time – everyone already knows how to do it from the back of their head. The rest should learn the same way – we don’t need to spoon-feed them!”

    This moment reminded me of the delicate balance between respecting expertise and institutionalising knowledge. The goal was never to replace people’s experience, but to preserve it—so that when they move on, retire, or change roles, the organisation doesn’t start from zero again.

    Through empathy, consistent communication, and genuine curiosity, I began to see attitudes shift. When people felt heard, they became open to documenting their own workflows. Some even took ownership by proposing improvements or volunteering to pilot-test new templates.

    The success of an SOP initiative CANNOT be measured only by the number of documents completed. Real impact shows up in how people work differently.

    In the end, what did success look like here?

    • New joiners were able to onboard faster.
    • Cross-departmental miscommunications reduced because everyone now had a shared reference.
    • Quality checks became easier because expectations were clear.
    • Most importantly, decision-making became more transparent.

    Imagine ISO auditors reviewing the documentation and commenting that your SOPs “reflected how people actually worked,” not just theoretical procedures.

    That, to me, is the ultimate validation.

    Writing SOPs in a legacy-driven company is less about writing and more about translating culture into structure. Writing SOPs is not just a technical task—it’s an act of transformation. You are, in essence, building the bridge between legacy and sustainability.

    What began as a documentation project often turns into a cultural awakening. You start by asking, “How do you do this?” and end up uncovering why things are done this way. Along the way, you learn that documentation is not about control—it’s about continuity.

    For me, this experience reinforced a belief I’ve always held – operational excellence begins with clarity, and clarity begins with people.

    The most meaningful SOPs I’ve written were not those that ticked every compliance box, but those that gave people a sense of structure, pride, and confidence in their work. They became, in many ways, a mirror of the organisation’s collective wisdom—finally captured, finally shared, and finally ready to evolve.