INTERNATIONAL CENTER FOR RESEARCH AND RESOURCE DEVELOPMENT

ICRRD QUALITY INDEX RESEARCH JOURNAL

ISSN: 2773-5958, https://doi.org/10.53272/icrrd

Evidence Makers

The research world is getting flipped on its head – frontline professionals aren't just reading studies anymore; they're writing them. This transformation runs on three critical infrastructure layers: systematic documentation methodologies, collaborative research networks, and technology platforms. Together, these layers let practitioners produce evidence that meets the same rigorous standards traditionally reserved for academic settings. The central question isn't whether practitioners can generate credible evidence, but how infrastructure enables them to meet standards that regulatory bodies now recognise as equivalent to traditional research.


Research used to be academia's exclusive club. Universities had the funding, protected time, and peer-review structures that operational settings lacked. Practitioners were cast as implementers of research findings rather than originators of knowledge. The infrastructure layers emerging now let practitioners generate evidence during their actual work, not as some separate academic exercise. This expansion of who can produce credible evidence represents a fundamental shift in knowledge production.


From Clinical Observation to Published Research

The gap between clinical observation and research evidence isn't about who's doing the observing. It's about the methodological structure they apply to their findings. Clinical observations involve noticing patterns. Research evidence requires systematic retrieval, quantification, and structured analysis to meet peer-review standards.


Without methodology, clinical observations are basically anecdotes with stethoscopes.


This requires systematic documentation methodologies that transform routine clinical work into rigorous research frameworks. Dr Amelia Denniss, an Advanced Trainee physician working across metropolitan, regional, and rural settings in New South Wales, provides one example of this approach. During a clinical project at Kirakira Hospital in the Solomon Islands, she co-designed a two-year retrospective audit covering July 2015 to July 2017. This involved systematically retrieving patient files and estimating inpatient bed-day utilisation. Her findings revealed that tuberculosis treatment consumed 15% of the Makira-Ulawa Province healthcare budget and identified diagnostic and monitoring gaps. These insights were published in Rural and Remote Health in May 2019, informing recommendations for improved testing methods. The difference between her work and routine clinical notes? Structure turned observations into evidence.
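
To make that structure concrete, here's a minimal sketch in Python of the quantification step such an audit involves. The records and field names are invented for illustration – they are not data from the published study – but the pattern is the point: a systematic tally, not impressionistic recall, is what turns chart review into evidence.

from datetime import date

# Hypothetical admission records retrieved during a retrospective audit.
records = [
    {"diagnosis": "tuberculosis", "admitted": date(2016, 3, 1), "discharged": date(2016, 4, 15)},
    {"diagnosis": "malaria", "admitted": date(2016, 5, 2), "discharged": date(2016, 5, 9)},
    {"diagnosis": "pneumonia", "admitted": date(2016, 8, 3), "discharged": date(2016, 8, 12)},
    {"diagnosis": "tuberculosis", "admitted": date(2017, 1, 10), "discharged": date(2017, 2, 20)},
]

def bed_days(record):
    # Length of stay in whole days for one admission.
    return (record["discharged"] - record["admitted"]).days

total = sum(bed_days(r) for r in records)
tb = sum(bed_days(r) for r in records if r["diagnosis"] == "tuberculosis")
print(f"Tuberculosis bed-days: {tb} of {total} ({tb / total:.1%})")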


Horace Cox, director of surveillance, disease prevention, and control at the Caribbean Public Health Agency, reflects on his experience with the Global Health Leadership Program: "[These lessons] helped me structure my thoughts much better. Now I can apply methodology to help me be more effective and efficient in what I do on a daily basis." Both clinical audits and public health surveillance rely on systematic data collection protocols. They transform observations into quantifiable evidence.


Methodological structure establishes the rigour required for practitioner findings to inform policy. It's not about institutional credentials. Systematic documentation protocols transform routine clinical observations into findings that meet peer-review standards. This approach shows that credibility depends on methodological rigour rather than institutional affiliation. As regulatory bodies increasingly incorporate real-world evidence into approval processes, individual practitioners' bounded projects anticipate broader shifts towards sustained longitudinal research.


Collaborative Networks as Research Infrastructure

Individual practitioners can handle bounded research projects just fine. But generating evidence at population scale? That's a different story entirely. Longitudinal studies need data governance, ethical oversight, collaborative protocols, and sustained funding. These are resources way beyond what individual clinicians can manage between patient appointments.


Collaborative networks establish data governance frameworks that standardise collection methods across multiple sites. They create ethical oversight protocols ensuring participant protection throughout extended timelines. They maintain sustained funding mechanisms that keep studies running across budget cycles and personnel changes.
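
What does a shared collection protocol look like in practice? Here's a hedged sketch, assuming a simple agreed schema: every site submits records against it, and anything that breaks the protocol is rejected before pooling. The fields and rules are illustrative inventions, not any particular network's standard.

# One agreed schema across all participating sites.
REQUIRED_FIELDS = {"site_id", "participant_id", "collected_on", "measure", "value"}
VALID_MEASURES = {"systolic_bp", "bmi", "smoking_status"}

def protocol_errors(record):
    # Return protocol violations; an empty list means the record pools cleanly.
    errors = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        errors.append(f"missing fields: {sorted(missing)}")
    if record.get("measure") not in VALID_MEASURES:
        errors.append(f"unrecognised measure: {record.get('measure')!r}")
    return errors

submissions = [  # toy records from two sites
    {"site_id": "A", "participant_id": "A-001", "collected_on": "2024-03-01",
     "measure": "systolic_bp", "value": 128},
    {"site_id": "B", "participant_id": "B-104", "measure": "bmi", "value": 24.1},
]

pooled = [r for r in submissions if not protocol_errors(r)]
rejected = [r for r in submissions if protocol_errors(r)]
print(f"pooled: {len(pooled)}, rejected: {len(rejected)}")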


This requires collaborative research networks that provide the institutional framework for sustained data collection and analysis. Dr Martin McNamara, CEO of The Sax Institute since January 2023 and Chief Investigator of the 45 and Up Study, provides one example of this approach. The 45 and Up Study has tracked more than 250,000 Australians since 2005, making it one of the largest longitudinal health studies globally. The Institute's mission integrates research into policy and service delivery through collaborative networks among research and policy agencies, particularly addressing chronic disease prevention.


You can't exactly ask individual doctors to manage quarter-million participant datasets.


Compare Denniss's two-year retrospective audit, completed during a five-week project, with the 45 and Up Study's twenty-year duration and participant scope. The infrastructure requirements are worlds apart. No individual clinician could maintain such data collection while fulfilling clinical duties. Instead, practitioners contribute through established protocols within collaborative networks.


Collaborative research networks form the second infrastructure layer. They let individual practitioners contribute to longitudinal studies at scales they can't sustain independently. This architecture connects frontline professionals to sustained data collection systems, proving that evidence generation beyond isolated case studies requires purpose-built institutional frameworks.


When Technology Does the Heavy Lifting

The move from manual to automated data collection marks a massive shift in evidence generation. Healthcare practitioners like Denniss manually retrieve files for audits, but technology platforms now automate systematic data capture across sectors. Manual documentation limits scale and introduces inconsistencies through human variation in recording detail and timing. Automated systems capture data continuously across operations without requiring practitioners to document separately from their core work. This creates datasets large enough for statistical analysis while letting practitioners focus on operational responsibilities rather than research administration.


This requires technology platforms that systematically capture operational data at scale for research analysis. Zig Serafin, based in Seattle and currently Vice Chairman and Special Advisor at Qualtrics, provides one example of this approach. As former CEO, he focused on enhancing the Qualtrics XM platform and working on artificial intelligence (AI) innovation. Qualtrics XM enables professionals to systematically collect experience data—customer interactions, employee feedback—for analysis using research frameworks. It's like having a research assistant embedded in every customer touchpoint.


Experience management platforms continuously capture interaction data at scale. They let practitioners query datasets reflecting millions of activities without manual documentation. These systems transform routine work into analysable evidence across sectors like retail and manufacturing.
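
As a toy illustration of what such platforms automate, the sketch below rolls captured interaction events up into per-touchpoint summaries ready for analysis. The event shape is an assumption for illustration, not any vendor's actual schema.

from collections import defaultdict

# Hypothetical interaction events captured automatically during routine work.
events = [
    {"touchpoint": "checkout", "satisfaction": 4},
    {"touchpoint": "returns", "satisfaction": 2},
    {"touchpoint": "checkout", "satisfaction": 5},
]

totals = defaultdict(lambda: [0, 0])  # touchpoint -> [score sum, event count]
for event in events:
    totals[event["touchpoint"]][0] += event["satisfaction"]
    totals[event["touchpoint"]][1] += 1

for touchpoint, (score_sum, count) in sorted(totals.items()):
    print(f"{touchpoint}: mean satisfaction {score_sum / count:.2f} (n={count})")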


A study conducted across all seven geographical regions of Indonesia shows how AI-based clinical decision support systems enable practitioners to generate better clinical evidence. The research involved 102 doctors and showed a 27% improvement in cardiovascular risk assessment accuracy and a 29% increase in appropriate statin prescriptions through AI-assisted evaluation. The technology platform captured and analysed clinical data at a scale no individual practitioner could maintain manually, transforming routine cardiovascular assessments into evidence about optimal prescribing patterns.


Experience management platforms represent the third infrastructure layer – technology systems that systematically capture operational data at scale. This extends practitioner evidence generation from healthcare's manual reviews to automated collection across sectors. The shift from research consumer to evidence generator becomes scalable when technology transforms routine transactions into analysable datasets.


When Methods Matter More Than Degrees

Once technology enables evidence generation at scale, the question becomes: how does practitioner-generated evidence gain credibility? The answer lies in meeting explicit methodological standards set by regulatory frameworks, not institutional pedigree.


For decades, academic gatekeeping worked like an exclusive club. Right university, right supervisor, right credentials. Now? The core principle is that credibility depends on methodological rigour rather than researcher credentials. We're talking systematic data collection and transparent analysis.


What does rigour actually mean? It includes systematic sampling that eliminates selection bias. Transparent analysis protocols that other researchers can replicate. Reproducible methods documented in sufficient detail for independent verification. Plus explicit acknowledgment of study limitations.
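
Two of those requirements – unbiased sampling and reproducibility – reduce to surprisingly little code. The sketch below draws a simple random sample from a hypothetical sampling frame using a documented seed, so another researcher can reproduce exactly the same draw.

import random

patient_ids = [f"P{i:04d}" for i in range(1, 501)]  # hypothetical sampling frame

rng = random.Random(2019)  # documented seed makes the draw reproducible
sample = rng.sample(patient_ids, k=50)  # every record has equal selection probability

print(sample[:5])  # re-running with seed 2019 yields the identical 50 IDs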


Sounds straightforward until you realise that rigour demands meeting every one of those requirements at once.


Regulatory bodies assess whether these standards are met regardless of whether evidence originates in academic laboratories or operational settings. They focus entirely on methodological quality rather than institutional affiliation.


Denniss's research gained credibility through peer-review publication by meeting rigorous standards during her clinical work. McNamara's 45 and Up Study reflects institutional-scale application with its longitudinal design and participant scope. Serafin's platforms enable practitioners to generate evidence that meets those standards by automating systematic data capture.


Regulatory validation frameworks establish that practitioner-generated evidence achieves acceptance through meeting methodological rigour previously associated only with academic research. This shows that 'who generates evidence' matters less than 'how it's generated.' The result? A shift from academic exclusivity to methodological accessibility.


Learning Research While Doing the Day Job

Understanding research principles doesn't automatically translate into generating evidence. There's a gap between knowing methodology exists and actually applying it. Practitioners need structured training in methodological application alongside organisational support providing time and resources.


The gap between conceptual understanding and practical application is substantial. Practitioners need training in systematic sampling techniques to avoid selection bias. They need data quality assessment to identify collection errors. They need statistical interpretation to distinguish signal from noise. They need ethical protocols to protect participants.
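
To make 'data quality assessment' concrete, here's a minimal example of the kind of check practitioners learn to run before analysis: flagging values that are missing or implausible. The field names and plausible ranges are illustrative assumptions.

PLAUSIBLE_RANGES = {"systolic_bp": (70, 250), "age": (0, 120)}

def quality_flags(row):
    # Flag missing and out-of-range values so collection errors surface early.
    flags = []
    for field, (low, high) in PLAUSIBLE_RANGES.items():
        value = row.get(field)
        if value is None:
            flags.append(f"{field}: missing")
        elif not low <= value <= high:
            flags.append(f"{field}: {value} outside [{low}, {high}]")
    return flags

rows = [{"systolic_bp": 120, "age": 54}, {"systolic_bp": 999, "age": None}]
for i, row in enumerate(rows):
    print(i, quality_flags(row) or "clean")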


This all sounds perfectly reasonable until you actually try doing it while managing your regular workload.


Organisations must allocate protected research time – dedicated hours where evidence generation is the primary responsibility. They must provide methodological mentorship from experienced researchers who can guide study design and analysis. Without these supports, practitioners can't bridge the theory-practice divide.


Structured training frameworks enable practitioner skill development by embedding methodological principles into everyday practice. The Curriculum for Wales framework demonstrates this through practitioner-led curriculum innovation built on a collaborative process.


Research on this framework has shown increased pupil engagement and professional fulfilment among teachers in Wales during 2024–2025. This practitioner-led curriculum making aligns with the broader trend of professionals becoming active generators of evidence by applying structured methodologies within their fields.


Different approaches require different competencies. Individual methodology like Denniss's systematic approach requires learning analytical frameworks. Institutional networks like McNamara's collaborative model require understanding how to contribute to sustained studies. Technology platforms like Serafin's automated systems require competency in querying datasets.


The Knowledge Power Shift

The old academic-to-practitioner pipeline worked like this: researchers investigated, published findings, practitioners implemented. Simple, hierarchical, one-way. Infrastructure layers now enable a bidirectional flow where practitioners generate evidence academics can't access.


Denniss's research quantified healthcare budget allocation directly from clinical reality. McNamara's institutional capacity enables longitudinal studies beyond individual academic grants. Serafin's platforms capture operational data across sectors at a scale impossible through manual documentation alone.


Regulatory validation confirms this bidirectional model. Real-world evidence gets incorporated into approvals by regulators such as the Food and Drug Administration in the United States. Structured training programs show how the practitioner competencies this model requires can be developed systematically.


Professional education must evolve to include research methodology as a core competency. Organisations must protect research time through resource allocation restructuring.


This means formally allocating specific hours for evidence generation within job descriptions and workload planning. You can't expect practitioners to conduct research as additional unpaid work beyond operational responsibilities. Technology systems should prioritise evidence generation capabilities alongside operational efficiency.


While infrastructure enables practitioner evidence generation, it also risks amplifying poor methodology at scale. Governance remains crucial: peer review and regulatory standards act as quality filters, rejecting studies with inadequate sampling, undisclosed conflicts, or unsupported conclusions. This prevents the proliferation of poorly designed practitioner studies that could undermine confidence in operationally generated evidence.


Transforming Knowledge Production

The incorporation of real-world evidence represents a structural shift where valid knowledge can originate from documented operational experience – when practitioners apply methodological rigour. We've moved from a world where only universities could generate credible research to one where frontline professionals produce evidence that regulatory bodies accept.


What do practitioners actually need? They need systematic documentation protocols like Denniss's approach. They need institutional frameworks supporting sustained research like McNamara's networks. They need technology platforms automating data capture like Serafin's systems.


But that's not enough. They need explicit validation standards establishing credibility criteria and training systems developing competencies alongside operational expertise. Organisations must restructure work allocation to support practitioner-led research.


The core principle isn't dissolving academic standards – it's expanding who can meet those standards. Methodological competence determines whether evidence shapes industry practice. Systematic approaches enable practitioners to generate insights academics simply can't access. They quantify operational realities and document frontline challenges that controlled studies miss.


Who would've thought the people actually doing the work might have something valuable to say about it? Turns out, they've had plenty to say all along – they just needed the right tools to say it properly.