
Monitoring and assessing the quality of care for youth: developing an audit tool using an expert consensus approach

Abstract

Background

The mental health needs of young people are often inadequately met by health services. Quality improvement approaches provide a framework for measuring, assessing and improving the quality of healthcare. However, a lack of performance standards and measurement tools is an impediment to their implementation. This paper reports on the initial stages of development of a clinical audit tool for assessing the quality of primary healthcare for Australian Indigenous youth aged 12–24, including mental health services provided within primary care.

Methods

Audit items were determined through review of relevant guidelines, expert reference group consensus opinion and specific inclusion criteria. Pilot testing was undertaken at four Indigenous primary healthcare services. A focus group discussion involving five staff from a health service participating in pilot testing explored user experiences of the tool.

Results

Audit items comprise key measures of processes and outcomes of care for Indigenous youth, as determined by the expert reference group. Gaps and conflicts in relevant guidelines and a lack of agreed performance indicators necessitated a tool development process that relied heavily on expert reference group advice and audit item inclusion criteria. Pilot testing and user feedback highlighted the importance of feasibility and context-specific considerations in tool development and design.

Conclusions

The youth health audit tool provides a first step in monitoring, assessing and improving the way Indigenous primary healthcare services engage with and respond to the needs of youth. Our approach offers a way forward for further development of quality measures in the absence of clearly articulated standards of care.

Background

Youth health services

Youth is a critical period of development and transition in which behaviours and habits can be established, experimentation and risk taking may take place and mental health problems can emerge. From the perspective of health services, this important life phase presents an opportunity for facilitating healthy lifestyles and identifying and mitigating risk factors for future ill-health [1]. However, there are many barriers preventing young people from accessing health services, such as healthcare availability, accessibility, acceptability and equity [2].

The role of primary healthcare

Primary healthcare (PHC) provides the foundation of healthcare systems internationally through key roles in prevention and early intervention [3, 4]. In some remote or developing contexts, PHC services may be the main or only healthcare providers. The frequent comorbidity of depression with substance misuse [5] and chronic disease [6] makes PHC an ideal avenue for prevention and management of these conditions. Amongst PHC services internationally, there is a need to establish and monitor standards of care for youth [2].

Australian Indigenous youth health

Young Indigenous Australians have a higher burden of disease than other Australian youth, largely attributable to high rates of mental illness such as anxiety and depression, substance use and injuries [7]. In comparison to their non-Indigenous counterparts, Indigenous young people also suffer substantially higher rates of chronic conditions, including diabetes, hearing loss, skin diseases, rheumatic heart disease [8] and poor sexual health [7], and experience considerable health challenges across a range of other indicators [7].

Healthcare standards for Australian Indigenous youth

Indigenous youth under-utilise health services and engage with healthcare at more advanced stages of illness and for shorter periods in comparison to non-Indigenous youth [9, 10]. The quality, capacity and cultural appropriateness of health services have been recognised as key factors in engagement with services and in health outcomes for Indigenous youth [7]. However, despite the different health profile of Indigenous youth, and unlike some other areas of Indigenous health, systems for monitoring and improving the performance of Indigenous youth health services are limited. There is no framework specific to Indigenous youth, and the indicator framework for Australian youth health, which combines primary healthcare and hospital data, does not provide data in a format that readily facilitates health system improvements. Minimal data are available to monitor standards at jurisdictional, regional or healthcare centre levels [11].

A need for data to help facilitate health service system improvements that are locally relevant, culturally appropriate and responsive to the needs of Indigenous youth has been identified from within the Indigenous primary healthcare sector [12]. A systematic search did not identify any existing tools for monitoring the performance of primary healthcare services in Indigenous youth health. Our aim was therefore to develop an audit tool for monitoring and assessing the delivery of health and wellbeing services for Indigenous youth, for use within a national quality improvement program.

Quality improvement as an approach to improving service delivery

Modern continuous quality improvement uses objective information to analyse and improve health systems, processes and outcomes [13, 14]. Through our Audit and Best Practice for Chronic Disease projects, we have used a quality improvement approach to develop tools and processes to facilitate cycles of feedback reports and workshops, goal setting, action planning and implementation of systems changes within Indigenous primary healthcare services (Figure 1) [15]. Over 230 Indigenous health centres across Australia have participated in evidence-based quality improvement processes under the auspices of One21seventy—the National Centre for Quality Improvement in Indigenous Primary Health Care. The One21seventy approach to quality improvement has been reported elsewhere [16–18].

Figure 1. One21seventy quality improvement cycle.

Quality improvement indicator development

Donabedian [19] provides a theoretical approach to measuring quality of healthcare based on structures, processes and outcomes of care, which has been generally accepted by quality improvement practitioners. Traditionally, the development of quality indicators has involved a combination of evidence-based guidelines and expert consensus. However, few published pilot and validation studies of audit tools contain explicit information about the methodology used to develop indicators or the assessment criteria used to evaluate them [20].

In a systematic review of guideline-based quality indicator development methods, Kötter, Blozik and Scherer [21] found that no gold standard for developing quality indicators exists. Two broad approaches to identifying indicators from guidelines were identified: the extraction of all guideline recommendations followed by the selection of indicators using a systematic expert consensus approach; and the selection of a limited number of guideline recommendations as the basis for indicators. In studies where a limited selection of guideline recommendations was extracted, criteria for preselection included the size of the impact on patient health; the relevance to obtaining value for money; the importance to quality of healthcare provided; the feasibility of monitoring; the level of evidence; the grade of the recommendation; and measurability.

According to Baker and Fraser [22], quality indicators should be determined according to the strength of the evidence behind best practice guidelines and their potential influence on outcomes of care. They should include diagnostic and management indicators and establish realistic standards of care; however, this approach does not take into account the burden of disease or stakeholder engagement. In contrast, Bickman [23] has advocated an approach based on program logic, involving the mapping of all stages of health service delivery processes to develop process-based quality indicators. The interaction of this method with evidence-based practice is unclear.

Aims

This paper describes the initial stages of development of a tool for auditing primary healthcare services for Indigenous youth aged 12–24. The audit tool is designed for use by clinical and non-clinical primary healthcare service staff to audit client records and includes scheduled mental health services provided within primary care. Specific objectives are to describe:

  • The development of audit items using a systematic methodology of identification of relevant items from guideline review and expert consultation; and

  • How results from pilot testing and a user feedback process informed the further development of the audit tool.

Methods

Our methodology is consistent with an approach described by Kötter et al. [21] where standards of care have not been pre-defined. In this method, evidence-based guideline recommendations are identified and audit items developed using a systematic expert consensus approach and explicit selection criteria. The youth health audit tool was developed through a process that considered factors relevant to the health system or population and health centre levels (Figure 2). An Expert Reference Group (ERG) was formed to advise on the development of a draft document. Findings from pilot testing and a user feedback process informed further refinement in addition to consultation with the ERG.

Figure 2. Audit tool development methodology.

Establishment of expert reference group

The ERG determined content validity of the tool [24] based on expert consensus opinion. The ERG comprised expertise in Indigenous youth health from primary healthcare services and research organisations with representatives from four states and territories. The group met on a monthly basis over an 18 month period via teleconference and was facilitated by a member of the research team. Terms of reference for the group were developed and agreed, including standard One21seventy criteria for determining audit items (with minor changes for the Indigenous youth population). Initial inclusion criteria were based on the importance of the health issue for the Indigenous youth population; effectiveness and appropriateness of the assessment, intervention or service; and ‘auditability’ or expectation of finding relevant information in client records and reporting on the item (Figures 2, 3).

Figure 3. Criteria for determining audit items for the delivery of primary healthcare to Indigenous youth.

Group facilitation techniques were used to achieve a consensus amongst ERG members. In instances where members disagreed on the inclusion of a specific item in the audit, all members were invited to outline and justify their views. Generally, disagreements were resolved with reference to the inclusion criteria, and particularly the need to satisfy the ‘auditability’ clause.

Development and refinement

Relevant guidelines for Indigenous youth health were identified (Table 1) and used as a basis for development of initial audit items. For example, a recommendation in national Indigenous preventive health guidelines to assess smoking status as part of a periodic health check provided the basis for an audit item measuring the proportion of clients whose smoking status had been assessed in the previous 24 months [25].
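As a hypothetical illustration of how such a guideline recommendation translates into a measurable audit item, the proportion can be computed over audited client records. The record structure and field names below are our own invention, not part of the audit tool:

```python
from datetime import date, timedelta

# Hypothetical audited client records: each holds the most recent date
# (if any) on which smoking status was assessed. Illustrative only.
records = [
    {"client_id": 1, "smoking_status_assessed": date(2024, 5, 1)},
    {"client_id": 2, "smoking_status_assessed": None},
    {"client_id": 3, "smoking_status_assessed": date(2021, 1, 10)},
]

def proportion_assessed(records, audit_date, window_days=730):
    """Proportion of clients whose smoking status was assessed within
    the audit window (24 months, approximated as 730 days, by default)."""
    cutoff = audit_date - timedelta(days=window_days)
    assessed = sum(
        1 for r in records
        if r["smoking_status_assessed"] and r["smoking_status_assessed"] >= cutoff
    )
    return assessed / len(records)

print(proportion_assessed(records, date(2025, 1, 1)))  # 1 of 3 clients assessed in window
```

The same pattern applies to any process-of-care item: a numerator of records documenting the recommended service within the audit window, over a denominator of eligible clients.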

Table 1 Guidelines for youth health in Indigenous primary healthcare

Pilot testing

The performance of the tool was pilot tested at four health centres in urban, regional and remote communities in the Northern Territory, South Australia, Western Australia and Queensland. The health centres included a range of organisation sizes and governance arrangements. At each pilot site, 30 client records were audited by trained auditors and health service staff. Pilot testing aimed to assess the recording in client records of specific items of care and the ease of use of the audit tool within different paper-based and electronic patient information systems. Notes and feedback from auditors were used to assess performance.

User feedback

Staff taking part in the pilot at one site participated in a semi-structured focus group discussion to explore user experiences of the tool. Staff at this health service were undertaking a number of complementary projects relating to Indigenous youth health, and two were represented on the ERG. Focus group participants were required to have completed at least one youth health audit using the audit tool. Purposive and maximum variance sampling [41] were employed to provide representation of a range of roles and organisational levels. Five staff, including Indigenous and non-Indigenous participants (a manager, two project officers and two quality improvement coordinators), took part in a discussion of approximately 1 h duration at the health service. The focus group was facilitated by the first author using an interview guide. Consent was obtained from all participants through a written agreement. The focus group discussion was audio recorded, transcribed and distributed to participants for checking and confirmation. Results were reported to the ERG and contributed to further refinement of the audit tool. Ethics approval was provided by ethics committees in each jurisdiction where the tool was piloted and qualitative data were gathered.

Qualitative analysis

Following a period of reflection, data were broken into units of meaning and initial themes were identified and discussed by two members of the research team. Initial manual data coding was conducted, and the frequency of each theme’s appearance and its distribution across the range of participants were determined to identify major and minor themes. Initial themes were then re-assessed, and final themes were agreed by the research team and applied to the data through coding. A process of thematic analysis was undertaken to consider and interpret findings. Participants were consulted about initial findings and given the opportunity to contribute to further analysis through circulation of a draft manuscript.
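The frequency-and-distribution step can be sketched as follows. This is a minimal illustration with invented participant identifiers and theme codes, not the study's actual data or thresholds:

```python
from collections import defaultdict

# Invented coded units of meaning: (participant_id, theme_code) pairs.
coded_units = [
    ("P1", "scope"), ("P1", "documentation"),
    ("P2", "scope"), ("P3", "scope"),
    ("P3", "terminology"), ("P4", "scope"),
]

# Tally each theme's frequency and the set of participants who raised it.
freq = defaultdict(int)
participants = defaultdict(set)
for pid, theme in coded_units:
    freq[theme] += 1
    participants[theme].add(pid)

# A theme raised frequently and by several participants is treated as
# major; the cut-offs here are arbitrary for illustration.
major = [t for t in freq if freq[t] >= 3 and len(participants[t]) >= 3]
print(major)  # ['scope']
```

In practice this tallying is typically done within qualitative analysis software or by hand, but the underlying logic of distinguishing major from minor themes is the same.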

Results

Audit tool content

The audit tool comprises items under the headings: general information; attendance at health centre; recording of key health information; scheduled immunisations; protective factors, risk factors, brief interventions and referral; scheduled services and follow up. A key component is a section on psychosocial concerns based on the HEEADSS (Home environment, Education and employment, Eating and diet, Activities and peer relationships, Drugs and alcohol, Sexual behaviour risks, Suicide and emotional wellbeing) assessment, an internationally accepted youth psychosocial assessment tool recommended by national guidelines for Indigenous youth [25, 41]. Cultural engagement and gambling were added to the HEEADSS items following recommendation from the ERG. Other items added to the audit tool following ERG advice were assessments of mature minor status and client consent. Audit items assess processes of care (including documentation of key client information), or a combination of processes and outcomes (for example, combining assessment of client risk status and the result of the risk assessment) (Table 2); assessment of the structure of care is dealt with separately in the One21seventy process through systems assessment (Figure 1).

Table 2 Final set of audit items in the youth health audit tool

The audit tool underwent four major phases of development as the ERG considered an initial draft and results from pilot testing and user feedback processes (Figure 4). The time taken to complete the audit and user feedback on the length of the tool provided impetus to further prioritise audit items. The original One21seventy criteria for determining the inclusion of audit items (Figure 3) required refinement for the purpose of the youth health audit tool. Further prioritisation of audit items was determined by the ERG according to best practice service delivery for the prevention, screening or treatment of risk factors and conditions identified as priority issues for the health and wellbeing of Indigenous youth, and considerations of what could reasonably be expected of health service practice and documentation, in addition to the original criteria (Figure 2). As a result, some psychosocial items were included in a briefer format and grouped as a ‘lifestyle discussion’ (home environment, education, employment, activities and peer relationships, diet and eating, gambling, and cultural engagement); in the scheduled services section, items relating to follow up of abnormal findings were grouped as ‘actions taken’. Some other items that were not well documented in client records, and which the ERG advised were unlikely to be specifically documented, were removed, including discussion of contraception and physical development.

Figure 4. Audit tool development process.

Gaps and conflicts in relevant guidelines for Indigenous youth health and a lack of agreed standards of care necessitated revision of the tool. In the final version, the ERG agreed to follow national Indigenous preventive health guidelines [25]. A lack of standard terminology for components of care delivery, particularly psychosocial components such as brief interventions and social or cultural treatments, required further consideration, discussion and advice from the ERG, and specification in the protocol.

Pilot testing

The audit tool was generally found to be straightforward to use within different paper-based and electronic patient information systems. Ease of use was facilitated where health services had adopted standard ways and locations for recording information in client files across practitioners. It was less time consuming to complete in health services with standard recording protocols, in services which stored client information only in electronic systems (without supplementing with paper-based records) and where electronic systems included a specific tab for recording psychosocial information.

User feedback

Scope and content

A major theme of the focus group discussion was the scope and content of the audit tool. While there was consensus amongst participants that the audit tool was not able to measure client engagement, there was discussion about the need to broaden its scope to determine whether unmet standards of care reflected client choice or the quality of the service offered, as illustrated:

… so what are you trying to measure, the uptake or also the quality of the care being offered [e.g.: for STI screening], so you’ve got a tick for being offered, but we also know they’re not being screened and if you see, you know 50% of people have… refused it, is it because of the way they’re being offered, so do we need to look at how people are asking/offering that thing, or is it because there is some other thing happening in the community where they don’t want to be part of that?

Participants discussed the scope of the audit eligibility criteria, disagreeing on whether including clients with chronic conditions would bias results. Feedback on the content of the audit tool included comments on the addition of items (main health service attended, chronic conditions care plan), amendments to items (separating diet and physical activity, including internal referrals in referral items) and the deletion of items (medication for eating problems, recording of client information in paper records). At the same time there were some general concerns expressed about the length of the tool.

Potential implications for service delivery

The ways in which audit data may inform service delivery were a central aspect of the discussion. Participants discussed how data on clients not attending appointments or declining treatment or referrals may lead to further investigation:

That [clients not attending referral services] might have to be something that as a service we look at…

The potential for data on reports received from referral services to lead to greater communication with external organisations was also discussed:

…there’s very low communication that exists, so that’s what’s designed to help highlight and then you can help prioritise for yourselves if that’s what you want to target, help to increase it…

In the context of a move towards electronic systems and shared electronic health records, this was thought to be particularly pertinent.

Participants discussed how audit data could inform strategies for improved service delivery and outcomes for clients with chronic conditions. The potential for the audit to provide information on their general care including psychosocial screening was highlighted, as illustrated:

…it might tell us whether the care was focussed on that condition rather than being, you know, covering all of the other psychosocial areas, because having done some of the audits that’s what I saw…

Quality of documentation in client records

Discussion of the quality of documentation in client records identified some challenging issues. Although it was acknowledged that appropriate documentation was a component of best practice care, there was a concern that some audit items were not well documented and may be difficult to audit. Inconsistencies in the recording of non-clinical items such as referrals, reports received from referral services and mature minor assessments were identified. The potential for the audit to extend to other databases for the purposes of searching for referral information and immunisations was explored.

Terminology and definitions

Inconsistent terminology in client records was identified as posing a further potential impediment to the audit process. The challenges of auditing psychosocial assessments were highlighted as participants discussed how to define items in the audit protocol. Greater clarity in definitions of brief interventions, reports received from referral services, long term health conditions, up to date health summaries, clients screened at risk, weight management plans and psychosocial assessments was recommended. Participants generally felt that terminology used within the audit tool should not be overly clinical and be consistent with other One21seventy audit tools for ease of use, particularly by non-clinical staff.

Discussion

Tool development and refinement

Consistent with Kötter et al. [21], our process combined extraction of recommendations from evidence-based guidelines with expert opinion and specific inclusion criteria. An important refinement was our pragmatic approach in seeking advice on the priority audit items and exploring feasibility issues with stakeholders through the ERG and end user focus group processes.

The importance of considering feasibility issues and the need to balance the breadth of items with length were highlighted by both the ERG and user feedback processes. The original One21seventy criteria for developing audit tools (Figure 3) were found to be insufficient for determining items in the youth health audit tool. Further prioritisation of audit items was determined by the ERG according to best practice service delivery for the prevention, screening or treatment of risk factors and conditions identified as priority issues for the health and wellbeing of Indigenous youth, and reasonable expectations of health services in the documentation and delivery of care, in addition to the original criteria (Figure 2). ‘Reasonable expectations’ primarily related to the recording of information in client records and the extent of communication and follow up with referral agencies that could reasonably be expected of health services, given resource constraints. It was felt that this approach would enable an adequate assessment of health service performance whilst avoiding setting unattainable standards.

The user feedback process demonstrated the ability of the audit tool to educate health service staff about standards of care and to facilitate reflective practice, through both the process of auditing individual client records and the interpretation of summary results. Some audit items (particularly psychosocial items) were therefore retained in the audit tool despite poor documentation in client records, in order to provide opportunities for improvement through staff education and reflection. Appropriate documentation of care is an important component of quality improvement and a legal and ethical issue. Poorly documented items are not graphed in the audit report provided to health services, in order to avoid misinterpretation, but are presented in tabular format with appropriate commentary on interpretation.

Limitations

For purposes of brevity and feasibility, the audit tool only assesses guideline-specified prevention, screening and treatment for certain risk factors and conditions prioritised by the ERG. It was not possible to establish criterion validity due to the absence of a ‘gold standard’ comparison to our process [21]. User feedback was only sought through a focus group discussion at one health service, but was combined with auditor notes from pilot testing at four sites and interpreted by the ERG which represented a range of health services. Audit results and inter-rater reliability from pilot testing are not reported in this manuscript as they do not directly relate to its scope but may be reported elsewhere. Further testing of the performance and reliability of the tool is needed.

Implications

In the absence of articulated standards of care, it is possible to develop an audit process premised on evidence-based guidelines, but the justification for each audit item must be explicit and realistic. Our experience was that engagement with stakeholders and experts and the use of selection criteria for audit items were pivotal in determining key measures of quality, and that a pragmatic approach that considered feasibility and reasonable expectations of health services, given constrained resources and the fragmented nature of referral services, was needed. We would recommend that the future development of such tools incorporate these processes and context-specific considerations while allowing further refinement post-implementation.

We expect that audits of youth health service delivery will encourage a greater focus on youth health amongst primary healthcare services, an area often overlooked, and in particular greater attention to screening, assessment and treatment of psychosocial issues. Results from the youth health audit may be of interest to policymakers in designing and supporting primary healthcare delivery models appropriate for Indigenous youth. As there are currently no clearly articulated standards of care specific to Indigenous youth health, policymakers may be interested in the final set of measures included in the audit tool if sector-wide reporting metrics and performance standards are developed for Indigenous youth health in future. Further research is needed to investigate the implementation of the audit tool and the nature and extent of any improvements to the quality of care for youth following successive cycles of use.

Conclusions

We found that developing robust measures of quality for primary healthcare for Indigenous youth required a pragmatic methodology which was specific, yet flexible. A lack of clearly articulated standards of care, along with gaps and conflicts in guidelines for evidence-based practice, ensured that expert consensus played a fundamental role in determining audit items. Clear criteria were needed to guide expert consensus, but the ability to further refine criteria enabled the Expert Reference Group to respond to issues raised in pilot testing and user feedback processes.

Findings from pilot testing and user feedback processes, that the tool was generally user friendly but impractical in length, facilitated additional prioritisation of audit items by the Expert Reference Group and hence enabled the tool to be further refined. Other feasibility issues raised in these processes, including variable documentation of some services in client records and limits to the ability of practitioners to follow up referral services, highlighted the need for the Expert Reference Group to consider reasonable expectations for health services in documenting and delivering care and avoiding setting unattainable standards, given resource constraints. However, the user testing process also demonstrated how the audit could help educate staff about best practice and provide an opportunity for reflective practice, suggesting a need to balance these functions with the feasibility issues described when establishing performance measures.

To our knowledge, the youth health audit tool represents the first attempt at developing such a resource. Further testing of the performance and reliability of the tool is needed. However, we hope that through incorporation in quality improvement processes such as the One21seventy program, the youth health audit tool will provide a first step in monitoring, assessment and improvement in the way Indigenous primary healthcare services engage with and respond to the needs of youth.

References

  1. World Health Organisation. Strengthening the health sector response to adolescent health and development. Geneva: WHO; 2010.

  2. World Health Organisation. Making health services adolescent friendly: developing national quality standards for adolescent-friendly health services. Geneva: WHO; 2012.

  3. World Health Organisation. The World Health Report 2008: Primary health care now more than ever. Geneva: WHO; 2008.

  4. Si D, Bailie R, Togni S, D’Abbs P, Robinson G. Aboriginal health workers and diabetes care in remote community health centres: a mixed method analysis. Med J Aust. 2006;185(1):40–5.

  5. Putt J, Delahunty B. Illicit drug use in rural and remote Indigenous communities. Trends and issues in crime and criminal justice, no. 322. Canberra: Australian Institute of Criminology; 2006.

  6. Si D, Dowden M, Kennedy C, Cox R, O’Donoghue L, Liddle H, et al. Indigenous community care: documented depression in patients with diabetes. Aust Fam Physician. 2011;40(5):331–3.

  7. Australian Institute of Health and Welfare. Young Australians: their health and wellbeing 2011. Cat. no. PHE 140. Canberra: AIHW; 2011.

  8. Couzos S, Murray R, editors. Aboriginal primary health care: an evidence-based approach. Sydney: Oxford University Press; 2008.

  9. Price M, Dalgleish J. Help-seeking behaviour among Indigenous Australian adolescents: exploring attitudes, behaviours and barriers. Youth Stud Aust. 2013;32(1):10–8.

  10. Westerman T. Engaging Australian Aboriginal youth in mental health services. Aust Psychol. 2010;45(3):212–22.

  11. Australian Institute of Health and Welfare. Health and wellbeing of young Australians: indicator framework and key national indicators. In: Australian Institute of Health and Welfare Bulletin. Canberra: AIHW; 2010. p. 77.

  12. Nori A. Improving health services for indigenous youth in Northwest Adelaide: technical report. Adelaide: Aboriginal Health Council of South Australia; 2006.

  13. McDermott RA, Schmidt BA, Sinha A. Improving diabetes care in the primary healthcare setting: a randomised cluster trial in remote Indigenous communities. Med J Aust. 2001;174(10):497–502.

  14. Graham N. Quality in health care: theory, application and evolution. Gaithersburg: Aspen Publishers; 1995.

  15. Bailie R, Si D, Shannon C, Semmens J, Rowley K, Scrimgeour D, et al. Study protocol: national research partnership to improve primary health care performance and outcomes for Indigenous peoples. BMC Health Serv Res. 2010;10:129.

  16. Schierhout G, Brands J, Bailie R. Audit and Best Practice for Chronic Disease Extension Project, 2005–2009: final report. Melbourne: The Lowitja Institute; 2010.

  17. Bailie R, Si D, O’Donoghue L, Dowden M. Indigenous Health: effective and sustainable health services through continuous quality improvement. Med J Aust. 2007;186:525–7.

  18. Bailie R, Matthews V, Brands J, Schierhout G. A systems-based partnership learning model for strengthening primary healthcare. Implement Sci. 2013;8:143.

  19. Donabedian A. Explorations in quality assessment and monitoring Vol. I: the definition of quality and approaches to its assessment. Ann Arbor: Health Administration Press; 1980.

  20. Hearnshaw H, Harker R, Cheater F, Baker R, Grimshaw G. A study of the methods used to select review criteria for clinical audit. Health Technol Assess. 2002;6(1):1.

  21. Kötter T, Blozik E, Scherer M. Methods for the guideline-based development of quality improvement indicators—a systematic review. Implement Sci. 2012;7:21.

  22. Baker R, Fraser R. Development of review criteria: linking guidelines and assessment of quality. Br Med J. 1995;311:370–3.

  23. Bickman L. The application of program theory to the evaluation of a managed mental health care system. Eval Program Plan. 1996;19(2):111–9.

  24. DeVellis RF. Scale development: theory and applications. California: Sage; 1991.

  25. National Aboriginal Community Controlled Health Organisation and Royal Australian College of General Practitioners. National guide to a preventive health assessment in aboriginal and Torres Strait Islander peoples. 2nd ed. Melbourne: Royal Australian College of General Practitioners; 2012.

  26. Medicare Australia. Medicare Benefits Schedule: health assessment for Aboriginal and Torres Strait Islander people (MBS item 715). http://www.health.gov.au/internet/main/publishing.nsf/Content/mha_resource_kit. Accessed 12 Aug 2014.

  27. Central Australian Rural Practitioners Association. Central Australian Rural Practitioners Association Standard Treatment Manual. 5th ed. Alice Springs: Central Australian Rural Practitioners Association; 2009.

  28. Congress Alukura and Nganampa Health Council Inc. Minymaku Kutju Tjukurpa—Women’s Business Manual. 4th ed. Alice Springs: Congress Alukura and Nganampa Health Council Inc; 2008.

  29. Queensland Health and the Royal Flying Doctor Service Queensland Section. Primary clinical care manual. 7th ed. Brisbane: Queensland Health and the Royal Flying Doctor Service Queensland Section; 2011.

  30. Queensland Health. Queensland sexual health clinical management guidelines. Brisbane: Queensland Health; 2010.

  31. Sexual Health Society of Victoria. National management guidelines for sexually transmissible infections. Melbourne: Sexual Health Society of Victoria; 2008.

  32. Western Australian Department of Health. Guidelines for managing sexually transmitted infections. Perth: Western Australian Department of Health; 2011.

  33. Cusack L, de Crespigny C, Gardner J, O’Connor J, Taylor K, Tovell A, et al. Alcohol Treatment Guidelines for Indigenous Australians. Canberra: Department of Health and Ageing; 2012.

  34. Henry-Edwards S, Gowing L, White J, Ali R, Bell J, Brough R, et al. Clinical guidelines and procedures for the use of methadone in the maintenance treatment of opioid dependence. Canberra: Department of Health and Ageing; 2003.

  35. Lintzeris N, Clark N, Winstock A, Dunlop A, Muhleisen P, Gowing L, et al. National clinical guidelines for the use of Buprenorphine in the treatment of opioid dependence. Canberra: Department of Health and Ageing; 2006.

  36. Royal Australian College of General Practitioners, Department of Health and Ageing, National Aboriginal Community Controlled Health Organisation. Interpretive guide of the RACGP standards for general practice for Aboriginal and Torres Strait Islander health services. 3rd ed. Melbourne: Royal Australian College of General Practitioners; 2010.

  37. Menzies School of Health Research. Recommendations for clinical care guidelines on the management of otitis media in aboriginal and Torres Strait Islander populations. Canberra: Department of Health and Ageing; 2010.

  38. National Heart Foundation of Australia, Cardiac Society of Australia and New Zealand. Diagnosis and management of acute rheumatic fever and rheumatic heart disease in Australia: an evidence-based review. http://www.heartfoundation.org.au/SiteCollectionDocuments/Diagnosis-Management-Acute-Rheumatic-Fever.pdf. Accessed 12 Aug 2014.

  39. Commonwealth of Australia. National physical activity guidelines. http://www.health.gov.au/internet/main/publishing.nsf/Content/health-pubhlth-strateg-phys-act-guidelines. Accessed 29 May 2012.

  40. Queensland Health, Royal Flying Doctor Service (Queensland Section), Apunipima Cape York Health Council. Chronic disease guidelines. 3rd ed. Brisbane: Queensland Health; 2010.

  41. Chown P, Kang M, Sanci L, Newnham V, Bennett DL. Adolescent health: enhancing the skills of general practitioners in caring for young people from culturally diverse backgrounds. GP resource kit. 2nd ed. Sydney: NSW Centre for the Advancement of Adolescent Health and Transcultural Mental Health Centre; 2008.

Authors’ contributions

SP coordinated the ERG, carried out pilot testing, analysed qualitative data and developed a draft version of the manuscript. TN developed the study design, chaired the ERG and analysed qualitative data. VM performed statistical analysis of quantitative data and contributed to audit tool design. DM contributed to audit tool design. RP contributed to audit tool design, contributed to qualitative analysis and carried out pilot testing. AN contributed to audit tool design and qualitative analysis. RB conceived of the study and developed the study design. All authors contributed to the manuscript. All authors read and approved the final manuscript.

Acknowledgements

We thank the members of the Expert Reference Group; the health services and their staff who were involved in audit tool development and pilot testing and who participated in a focus group discussion; Audit and Best Practice for Chronic Disease National Research Partnership staff (past and present), including Jo Newham, Jacki Ward, Tania Patrao, Anna Sheahan and Cynthia Croft; and our funders, beyondblue and the Lowitja Institute. RB’s work is supported by an ARC Future Fellowship (#FT100100087).

Compliance with ethical guidelines

Competing interests

The authors declare that they have no competing interests.

Author information

Corresponding author

Correspondence to Stefanie Puszka.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

About this article

Cite this article

Puszka, S., Nagel, T., Matthews, V. et al. Monitoring and assessing the quality of care for youth: developing an audit tool using an expert consensus approach. Int J Ment Health Syst 9, 28 (2015). https://doi.org/10.1186/s13033-015-0019-5

Keywords