

From Legal Gatekeeper to Cognitive Auditor:
Transforming the Role of Government Legal Units in Policy Evaluation in the Age of Epistemic Governance


Manuscript for Q3 Journal – March 2026

ABSTRACT

Government legal units in many OECD and middle‑income administrative systems hold a strategic mandate in policy evaluation. Yet evaluation practices remain dominated by a legal‑formal paradigm focused on procedural compliance, legality, and drafting techniques, leaving the dimension of pre‑decision reasoning quality—assumptions, framing, option exploration, and documentation of dissent—largely untouched. This conceptual article develops a novel framework—Pre‑Decision Governance (PDG)—through narrative synthesis of international literature on epistemic governance, policy learning, deliberative governance, and regulatory quality assurance. The article argues that current evaluation instruments, while normatively sound, fail to capture reasoning quality due to a lack of standardized methodologies and formalistic implementation. Four PDG protocols are proposed: assumption testing, counter‑framing, multi‑option mandate, and structured dissent. The article contributes to regulatory governance literature by shifting the evaluative locus from procedural compliance to epistemic robustness, conceptualizing legal units not merely as legal gatekeepers but as institutional cognitive auditors—a functional reframing that extends their analytical role without creating new bureaucratic positions. Implementation readiness analysis suggests that transforming legal units requires gradual capacity building; the framework acknowledges that PDG operates within existing power structures, enhancing reasoning transparency rather than eliminating political influence.

Keywords: policy evaluation, legal units, epistemic governance, pre‑decision governance, regulatory quality, cognitive auditing

1. INTRODUCTION

1.1 The Persistent Paradox in Policy Evaluation

Governments across many OECD and middle‑income countries have invested significantly in improving regulatory and policy quality over the past two decades. The OECD’s regulatory policy agenda has promoted evidence‑based tools such as Regulatory Impact Assessment (RIA), stakeholder engagement, and ex‑post evaluation (OECD, 2021). Many countries have established central oversight bodies, including the Office of Information and Regulatory Affairs (OIRA) in the United States, the Regulatory Policy Committee in the United Kingdom, and similar institutions across OECD member states (Radaelli & De Francesco, 2019). Digital transformation has enabled new forms of inter‑agency coordination and data sharing, as seen in the European Union’s Better Regulation agenda and various national e‑government platforms (European Commission, 2023).

Yet a persistent paradox remains. Policy failures continue to occur within formally compliant processes. Major infrastructure projects stall not because of data inconsistency, but because underlying assumptions—about construction timelines, economic growth, or program effectiveness—remain untested and unaligned across agencies. Dissenting views, when they emerge, disappear into informal channels, leaving no trace for future learning.

Government legal units—whether called legal bureaus, offices of legal counsel, or legislative drafting divisions—stand at the front lines of this challenge. They are responsible for drafting regulations, facilitating inter‑agency coordination, and evaluating existing policies. However, a growing body of literature suggests that policy evaluation by legal units faces systemic challenges that remain unaddressed (Dunlop & Radaelli, 2022; Heikkila & Gerlak, 2019).

1.2 Problem Statement

This article addresses three research questions:

  1. Why do existing policy evaluation instruments—such as impact assessments, public consultations, and ex‑post reviews—fail to capture the quality of pre‑decision reasoning, despite being normatively mandated?
  2. How can a Pre‑Decision Governance (PDG) framework, synthesized from international literature, offer a new perspective for transforming the role of government legal units?
  3. What are the conceptual and practical implications of this role transformation, considering human resource readiness, bureaucratic culture, and the realities of political power?

1.3 Approach and Contribution

This article adopts a conceptual approach based on narrative synthesis of international literature. Unlike empirical studies that test hypotheses with data, this article aims to contribute a new conceptual framework (Pre‑Decision Governance) that enriches discourse on policy evaluation.

The article makes a distinctive contribution to regulatory governance literature by shifting the evaluative locus from procedural compliance to epistemic robustness, conceptualizing legal units not merely as legal gatekeepers but as institutional cognitive auditors. This reframing opens new avenues for both scholarly inquiry and practical reform.

2. THEORETICAL FRAMEWORK

2.1 Epistemic Governance

Epistemic governance (Jalonen, 2025; Lidskog & Sundqvist, 2025) emphasizes the processes of knowledge production, validation, and dissemination in governance. Jalonen (2025) defines epistemic governance as “the processes shaping collective perceptions and influencing the understanding of a situation,” arguing that in complex, crisis‑prone environments, governance must move beyond traditional models to embrace uncertainty and diverse forms of knowledge. Lidskog and Sundqvist (2025), studying the Intergovernmental Panel on Climate Change (IPCC), identify how “epistemic hierarchies” and disciplinary diversity create challenges for maintaining coherence in global assessments.

This literature operates primarily at the macro‑level of knowledge systems. PDG operationalizes epistemic governance at the micro‑institutional level within bureaucratic policy production. While epistemic governance addresses how entire policy systems produce and validate knowledge, PDG focuses specifically on the internal reasoning processes of government legal units—how assumptions are formulated, how problems are framed, and how dissent is managed in routine policy development. This micro‑level application complements rather than replicates the macro‑level insights of epistemic governance theory.

2.2 Policy Learning

Policy learning literature (Dunlop & Radaelli, 2022; Heikkila & Gerlak, 2019) examines mechanisms through which organizations learn from policy successes and failures. Dunlop and Radaelli (2022) develop a framework for understanding policy learning in comparative perspective, identifying different learning modes depending on context and actors involved. Heikkila and Gerlak (2019) highlight how institutional rules affect organizational learning capacity, finding that weak feedback mechanisms, inadequate documentation, and lack of institutional reflection hinder learning from experience.

2.3 Deliberative Governance

Deliberative governance theory (Ansell & Gash, 2007; Habermas, 1996) emphasizes the importance of dialogue and reasoned argumentation in legitimate collective decision‑making. For policy evaluation, this literature highlights that the quality of public participation and deliberative processes matters as much as substantive decisions. Formalistic public consultations, without adequate documentation and response mechanisms, fail to generate legitimacy or improved policy quality.

2.4 Regulatory Quality Assurance

Regulatory quality assurance literature (OECD, 2021; Radaelli & De Francesco, 2019) addresses standards for regulatory quality, including impact assessment, stakeholder consultation, and ex‑post evaluation. Radaelli and De Francesco (2019) examine Regulatory Impact Assessment (RIA) practices across countries, finding that implementation often falls short of ideals. RIA frequently becomes a “compliance exercise” rather than substantive analysis due to time pressure, limited capacity, and lack of standardized methodology.

2.5 Synthesizing the Literature: Conceptual Gaps

Synthesizing these four literatures reveals conceptual gaps in current policy evaluation practices:

| Aspect | Ideal Practice | Actual Practice | Gap |
|---|---|---|---|
| Knowledge production | Assumptions tested, data verified | Assumptions implicit, data not verified | No assumption‑testing mechanisms |
| Knowledge dissemination | Substantive public consultation | Formalistic consultation | Input not systematically documented |
| Knowledge validation | Scrutiny includes reasoning quality | Scrutiny focuses on legal‑formal aspects | Reasoning quality not validated |
| Institutional learning | Ex‑post evaluation for improvement | Evaluation for compliance only | Failures from flawed assumptions not learned |

3. THE PRE‑DECISION GOVERNANCE FRAMEWORK

3.1 Conceptual Foundations

Pre‑Decision Governance (PDG) is developed as a synthesis of the four literatures above. It focuses on the quality of reasoning processes before decisions are made, rather than solely on outcomes or formal procedures. Four PDG pillars are formulated:

| Pillar | Description | Theoretical Foundation |
|---|---|---|
| Assumption Testing | Identifying and testing critical assumptions underlying policy proposals | Epistemic governance (micro‑level operationalization), assumption‑based planning (Dewar, 2002) |
| Counter‑Framing | Exploring problem definitions from multiple perspectives | Framing theory (Rein & Schön, 1993), deliberative governance |
| Multi‑Option Mandate | Analyzing at least three substantially different policy alternatives equally | Multi‑criteria analysis (Keeney & Raiffa, 1993), policy learning |
| Structured Dissent | Creating institutionalized channels for dissenting views | Groupthink theory (Janis, 1982), cognitive bias research (Sunstein & Hastie, 2015) |

PDG operationalizes epistemic governance at the micro‑institutional level by translating abstract principles of knowledge validation into concrete protocols for bureaucratic reasoning. While epistemic governance addresses system‑wide knowledge dynamics, PDG provides the procedural infrastructure for implementing those principles within the routine work of legal units.

In this article, reasoning quality refers to the explicit articulation, testing, documentation, and contestation of assumptions underlying policy proposals.

3.2 Why PDG Differs from Existing Practices

Existing policy evaluation instruments—RIA, public consultation, legal harmonization—have worthy goals but often fail in implementation due to:

  1. Lack of standardized methodology – No common method exists for documenting assumptions, testing framing, and exploring options; consequently, analytical quality varies widely across units.
  2. Formalistic implementation – Existing instruments frequently become administrative formalities rather than integral parts of reasoning testing.
  3. Legal‑formal focus – Harmonization and evaluation remain dominated by legal‑formal perspectives, ignoring pre‑decision reasoning quality.

PDG offers a complementary layer that can be integrated without altering existing structures. It does not replace RIA or public consultation but adds a previously overlooked dimension.

3.3 Testable Propositions

This article advances three propositions for future empirical testing:

Proposition 1: Legal units that consistently document and test critical assumptions in policy development processes produce policies with fewer post‑implementation corrective amendments attributable to assumption error.

This formulation avoids conflating political revisions with quality‑related revisions, focusing specifically on amendments traceable to flawed initial assumptions.

Proposition 2: The availability of structured dissent mechanisms in harmonization processes positively correlates with the quality of post‑implementation evaluation, as measured by the proportion of identified implementation problems that were anticipated during formulation.

Proposition 3: PDG integration into existing evaluation practices requires gradual capacity building and the presence of reform champions within units, and cannot be implemented through top‑down mandates alone.
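To make the propositions above concrete, the outcome measures they imply can be sketched in code. This is an illustrative sketch only: the record fields and function names are assumptions introduced here for clarity, not part of the PDG framework itself.

```python
from dataclasses import dataclass


@dataclass
class PolicyRecord:
    """Hypothetical per-policy record; field names are illustrative assumptions."""
    amendments_total: int             # all post-implementation corrective amendments
    amendments_assumption_error: int  # amendments traceable to flawed initial assumptions
    problems_identified: int          # implementation problems identified ex post
    problems_anticipated: int         # of those, flagged during formulation


def assumption_error_rate(p: PolicyRecord) -> float:
    """Proposition 1 metric: share of corrective amendments attributable to
    assumption error (expected to be lower under consistent PDG use)."""
    if p.amendments_total == 0:
        return 0.0
    return p.amendments_assumption_error / p.amendments_total


def anticipation_rate(p: PolicyRecord) -> float:
    """Proposition 2 metric: proportion of identified implementation problems
    that were anticipated during formulation (expected to be higher where
    structured dissent mechanisms are available)."""
    if p.problems_identified == 0:
        return 1.0  # no unforeseen problems to anticipate
    return p.problems_anticipated / p.problems_identified
```

For example, a policy with 10 corrective amendments, 3 of them traceable to assumption error, and 6 of 8 implementation problems anticipated during formulation would score 0.30 and 0.75 on the two metrics. Separating politically motivated revisions from assumption‑error revisions, as Proposition 1 requires, is a coding decision that empirical studies would need to operationalize.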

4. IMPLICATIONS FOR GOVERNMENT LEGAL UNITS

4.1 Role Transformation: Cognitive Auditor as Functional Reframing

The term cognitive auditor is introduced as a functional reframing rather than a proposal for new bureaucratic positions. It describes an extension of the analytical role that legal units can adopt within existing mandates, not a new job classification requiring organizational restructuring.

| Aspect | Traditional Role (Legal Gatekeeper) | Extended Role (Cognitive Auditor) |
|---|---|---|
| Focus | Formal compliance, legality | Reasoning quality, assumptions, framing |
| Key question | "Is this legal?" | "Does this make sense?" |
| Methods | Document verification, article comparison | Assumption testing, framing analysis, dissent documentation |
| Outputs | Legal opinions, formal recommendations | Reasoning quality assessments, assumption improvement notes |
| Competencies needed | Legal knowledge, drafting skills | Policy analysis, basic statistics, facilitation skills |

The cognitive auditor role does not replace the legal gatekeeper function; it adds an analytical layer to existing responsibilities.

4.2 Simple Instruments for Adoption

Legal units can adopt simple instruments without overburdening existing systems:

  1. Assumption Mapping Form (one page) – Documents critical assumptions, data sources, and confidence levels.
  2. Assumption Testing Sheet (one page) – Records results of assumption testing against historical data or benchmarks.
  3. Dissent Documentation Form (one page) – Records dissenting views, arguments, and follow‑up actions.
  4. Simple Reasoning Quality Index – Scores 1‑4 for each PDG pillar, used for internal learning rather than performance evaluation.

These instruments can be piloted gradually in units with progressive leadership.
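The Simple Reasoning Quality Index described above can be sketched as a small data structure. A minimal sketch, assuming a simple mean as the aggregation rule; the pillar names come from the framework, while the validation logic and averaging are illustrative assumptions.

```python
from dataclasses import dataclass

# Pillar names follow the four PDG protocols defined in the article.
PILLARS = ("assumption_testing", "counter_framing",
           "multi_option_mandate", "structured_dissent")


@dataclass
class ReasoningQualityScore:
    """One completed index sheet: each PDG pillar scored 1-4.
    Intended for internal learning, not performance evaluation."""
    assumption_testing: int
    counter_framing: int
    multi_option_mandate: int
    structured_dissent: int

    def __post_init__(self) -> None:
        # Reject out-of-range scores at construction time.
        for pillar in PILLARS:
            value = getattr(self, pillar)
            if not 1 <= value <= 4:
                raise ValueError(f"{pillar} must be scored 1-4, got {value}")

    def index(self) -> float:
        """Aggregate index as the mean of the four pillar scores
        (an assumed aggregation rule; weighting schemes are possible)."""
        return sum(getattr(self, p) for p in PILLARS) / len(PILLARS)
```

A unit scoring 3, 2, 4, and 3 on the four pillars would obtain an index of 3.0. Keeping the aggregation this simple reflects the article's design intent: the instrument should support reflection within units, not enable cross‑unit ranking.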

5. IMPLEMENTATION READINESS AND POLITICAL CONSTRAINTS

5.1 Human Resource Readiness

Literature indicates that legal units across countries generally have limited policy analysis capacity (Radaelli & De Francesco, 2019). However, this is not an absolute barrier. Organizational capacity development literature (Andrews, 2013) shows that change can begin with small steps and gradual learning.

Realistic development strategies:

  • Short‑term: Basic policy analysis training and simple instrument use.
  • Medium‑term: Cross‑disciplinary recruitment (policy science, statistics, economics).
  • Long‑term: Development of analytical specialization within legal units.

5.2 Bureaucratic Culture Readiness

Bureaucratic cultures in many administrative systems tend to be hierarchical, risk‑averse, and resistant to practices that might challenge leadership assumptions. Institutional change literature (Mahoney & Thelen, 2010) suggests change can occur through layering—adding new elements onto existing structures without radical transformation. PDG is designed as layering: it does not alter existing hierarchies but adds new protocols that can be gradually adopted by ready units.

5.3 Acknowledging Political Constraints

A critical qualification is necessary: PDG operates within existing power structures and cannot neutralize political override. Its primary function is to enhance reasoning transparency and create documented traces of deliberation, not to eliminate political influence. Decisions will continue to be shaped by power dynamics, political pressures, and hierarchical authority. PDG’s contribution lies in ensuring that when political override occurs, it is documented and its reasoning—or lack thereof—becomes visible for future accountability and learning. The framework does not assume technocratic receptiveness; it provides tools for reformist minorities within systems that may remain politically constrained.

6. CONCLUSION AND RECOMMENDATIONS

6.1 Conclusion

This article has developed a conceptual framework for understanding policy evaluation challenges facing government legal units and offered a new perspective through Pre‑Decision Governance. The main arguments are:

  1. Existing evaluation practices remain dominated by a legal‑formal paradigm that ignores pre‑decision reasoning quality.
  2. Epistemic gaps occur because existing instruments are not designed to capture reasoning quality, and implementation remains formalistic.
  3. The PDG framework, synthesized from epistemic governance, policy learning, deliberative governance, and regulatory quality literature, offers a complementary layer to address these gaps.
  4. Transforming legal units requires gradual capacity building and operates within existing power structures, enhancing reasoning transparency rather than eliminating political influence.

The article contributes to regulatory governance literature by shifting the evaluative locus from procedural compliance to epistemic robustness, conceptualizing legal units not merely as legal gatekeepers but as institutional cognitive auditors—a functional reframing that extends their analytical role without creating new bureaucratic positions.

6.2 Recommendations

| Stakeholder | Recommendations |
|---|---|
| Government legal units | Pilot simple instruments on strategic policies; develop staff capacity gradually. |
| Central oversight bodies | Consider adding reasoning quality dimensions to existing evaluation guidelines. |
| Legislative bodies | Request assumption documentation in policy deliberations where feasible. |
| International organizations | Facilitate cross‑country exchange of good practices and capacity development. |

6.3 Future Research Agenda

This study has limitations as a conceptual inquiry. Future research should:

  1. Test the proposed propositions through empirical studies across different national contexts.
  2. Develop and validate instruments for measuring reasoning quality.
  3. Examine how PDG protocols interact with different political and bureaucratic cultures.
  4. Investigate the conditions under which reformist minorities successfully adopt such frameworks within resistant systems.

Figure 1 – PDG as a Layering Framework

┌─────────────────────────────────────────────────────────────┐
│                EXISTING INSTITUTIONAL LAYER                 │
│    (RIA, public consultation, legal harmonization, etc.)    │
└─────────────────────────────────────────────────────────────┘
                              │
                              ▼
┌─────────────────────────────────────────────────────────────┐
│                  PRE‑DECISION GOVERNANCE                    │
│  ┌─────────────┬─────────────┬─────────────┬─────────────┐  │
│  │ Assumption  │  Counter-   │ Multi-Option│ Structured  │  │
│  │  Testing    │  Framing    │   Mandate   │  Dissent    │  │
│  └─────────────┴─────────────┴─────────────┴─────────────┘  │
└─────────────────────────────────────────────────────────────┘
                              │
                              ▼
┌─────────────────────────────────────────────────────────────┐
│            LEGAL UNITS AS COGNITIVE AUDITORS                │
│     (enhanced analytical role within existing mandates)     │
└─────────────────────────────────────────────────────────────┘

REFERENCES

Andrews, M. (2013). The limits of institutional reform in development. Cambridge University Press.

Ansell, C., & Gash, A. (2007). Collaborative governance in theory and practice. Journal of Public Administration Research and Theory, 18(4), 543-571.

Dewar, J. A. (2002). Assumption‑based planning: A tool for reducing avoidable surprises. Cambridge University Press.

Dunlop, C. A., & Radaelli, C. M. (2022). Policy learning in comparative perspective: A new analytical framework. Policy & Politics, 50(1), 3-24.

European Commission. (2023). Better Regulation Guidelines. Publications Office of the European Union.

Habermas, J. (1996). Between facts and norms. MIT Press.

Heikkila, T., & Gerlak, A. K. (2019). Working on learning: How the institutional rules of environmental governance matter. Journal of Environmental Policy & Planning, 21(1), 92-107.

Jalonen, H. (2025). Epistemic governance in the context of crisis: A complexity‑informed approach. Administration & Society, 57(2), 218-253.

Janis, I. L. (1982). Groupthink (2nd ed.). Houghton Mifflin.

Keeney, R. L., & Raiffa, H. (1993). Decisions with multiple objectives. Cambridge University Press.

Lidskog, R., & Sundqvist, G. (2025). Expert advice and global environmental governance: Institutional and epistemic challenges. Sustainability, 17(17), 7876.

Mahoney, J., & Thelen, K. (2010). Explaining institutional change: Ambiguity, agency, and power. Cambridge University Press.

OECD. (2021). OECD Regulatory Policy Outlook 2021. OECD Publishing.

Radaelli, C. M., & De Francesco, F. (2019). Regulatory quality assurance and impact assessment. Edward Elgar.

Rein, M., & Schön, D. (1993). Reframing policy discourse. In F. Fischer & J. Forester (Eds.), The argumentative turn in policy analysis (pp. 145-166). Duke University Press.

Sunstein, C. R., & Hastie, R. (2015). Wiser: Getting beyond groupthink to make groups smarter. Harvard Business Review Press.