MOA Benchmarking

Information Services

Brisbane (West), QLD · 902 followers

MOA: A connected platform for quality, compliance, risk and improvement across aged, disability and health care.

About us

MOA Benchmarking comprises the largest community of aged, retirement, and disability care providers networked for benchmarking data and quality improvement. MOA's online platform supports members in self-assessing their performance against relevant regulatory standards and benchmarking current performance against past performance and peers. Members receive instant performance insights through our powerful reporting module to assist in identifying potential risks and opportunities for improvement. These risks and opportunities are automatically incorporated into a digital Plan for Continuous Improvement. MOA also collaborates with the Department of Health to support residential aged care participation in, and submissions for, the National Aged Care Mandatory Quality Indicator Program. The platform is customisable and backed by a team of experts who continuously update our suite of quality tools in response to changing regulations, legislation, and accepted best practices.

Website
http://www.moa.com.au
Industry
Information Services
Company size
11-50 employees
Headquarters
Brisbane (West), QLD
Type
Privately Held
Founded
1999
Specialties
Quality Improvement, Risk Management, Benchmarking, Audits, Compliance and Accreditation, Best Practice Systems, Surveys, Quality Indicators, Performance Indicators, Report and software development, Planning for Continuous Improvement Systems, National Aged Care Quality Indicator Program, and Incident Management


Updates

  • MOA Benchmarking reposted this

    About half of the residential aged care sector participates in the National Quality Indicator Program using MOA Benchmarking’s platform. Of those services, roughly half also use our individual-level survey collection tools, while the rest rely on internal systems or third-party tools for primary data collection.

    Recently, a few questions came through about refusal rates and how they’ve been changing over time. Some time ago I reported that refusal rates jumped sharply after the first couple of QoL/QCE rounds. Now that Filip Reierson has revisited these data, we see that they’ve largely plateaued. That pattern is shown on the first page.

    For services where we have individual row-level data, meaning they’re using MOA’s collection tools, refusal rates are noticeably lower than for services using other systems for primary collection. The size of that difference is not trivial: it was bigger than I expected, on the order of 20%. At first glance, there’s no obvious reason why simply using a different collection tool should produce materially lower refusal rates, so we dug further.

    Where it gets more interesting is when you split the data by how the survey is collected (see page 2). The second chart breaks down proxy responses for services using MOA tools versus non-MOA tools. Services using the MOA tools have a much higher proportion of surveys completed by proxies. Importantly, resident self-completion and interviewer-facilitated responses are virtually the same across both groups. The difference in refusal and non-completion is therefore being driven almost entirely by differences in proxy completion.

    That raises an obvious question about what’s driving the difference. “Non-MOA” covers a wide range of systems and approaches, so it’s unlikely there’s a single explanation. But whatever the mix of reasons, proxy participation clearly plays a large role in overall refusal rates.
    This means that facilitating proxy responses properly has a real impact on overall participation. And at the scale we’re talking about, that matters. These analyses are based on around 100,000 surveys in a single quarter, roughly 50,000 in each group. A difference of 10 percentage points at that volume is not marginal: it materially changes the completeness and representativeness of the data.

    For me, the takeaway isn’t just about refusal rates. It’s a reminder that seemingly small design and workflow decisions in data collection systems can have very large downstream effects on what we end up measuring, and who gets counted at all. If you want to go deeper into the detail, Filip has written up the full analysis here (🔗 https://lnkd.in/gZm4m2xi).
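As a rough illustration of the two quantities being compared above (refusal rate by collection tool, and proxy share among completed surveys), here is a minimal sketch. The record counts, group labels, and field names are invented for illustration; they are not MOA's actual data or schema.

```python
from collections import Counter

# Hypothetical survey records as (tool, response mode) pairs.
# The counts are illustrative only, not MOA's data.
records = (
    [("MOA", "proxy")] * 180 + [("MOA", "self")] * 120
    + [("MOA", "interviewer")] * 100 + [("MOA", "refused")] * 40
    + [("Other", "proxy")] * 60 + [("Other", "self")] * 120
    + [("Other", "interviewer")] * 100 + [("Other", "refused")] * 160
)

counts = Counter(records)

def refusal_rate(tool):
    """Refused surveys as a share of all surveys for this tool."""
    total = sum(n for (t, _), n in counts.items() if t == tool)
    return counts[(tool, "refused")] / total

def proxy_share(tool):
    """Proxy completions as a share of completed (non-refused) surveys."""
    completed = sum(n for (t, m), n in counts.items()
                    if t == tool and m != "refused")
    return counts[(tool, "proxy")] / completed
```

With these toy numbers, the "Other" group has a much higher refusal rate while self-completion and interviewer-facilitated counts are identical across groups, so the gap is carried entirely by the proxy column — the same pattern the post describes.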

  • Did you know? One small but practical feature of our Feedback & Complaints system is how responses are handled. When teams reply to feedback or complaints, they can either use organisation-defined response templates or, where a template doesn’t exist, use AI to draft a response that staff can review and edit before sending. It’s a small thing, but it makes day-to-day complaints handling a lot easier to manage.

    And while this post is about response workflows, it’s worth saying: the system itself is fully compliant with regulatory requirements, including whistleblower provisions in the Act.

    Speaking of requirements, MOA Benchmarking and Health Metrics team members Riona Cusack and Clare Stronach attended this week's 'Rights-based complaints handling under the new Aged Care Act' webinar and are pleased to report that our platform ticks all the boxes! See Aged Care Quality and Safety Commission: (🔗 https://lnkd.in/gx8hKSYB)

  • Tomorrow marks Lunar New Year. It is celebrated across many cultures, most notably in China, and also in Vietnam, Korea and other parts of East and Southeast Asia. In Vietnam, the celebration is known as Tết and is the most significant cultural event of the year.

    Australia’s cultural and linguistic diversity is clearly reflected in aged care. In Filip Reierson's recent MOA analysis on language diversity, we found that among aged care residents born in non-English-speaking countries, 43.2% prefer to communicate in a language other than English. That’s a substantial proportion, and it reinforces why multilingual capability is not a “nice to have” in aged care services. (🔗 https://lnkd.in/gkTgMkfT)

    The patterns also differ from the general population. Mandarin is spoken by 2.7% of Australians and Cantonese by 1.2%, but our aged care survey data shows a different balance: among MOA survey responses completed in either Mandarin or Cantonese, Cantonese accounts for 42.4%, compared with 30.1% in the general population. This likely reflects migration history, with many Cantonese speakers arriving in earlier decades and now entering aged care.

    This is why our survey tools are designed to support clients and residents to respond in their language of preference. Language shapes how people express needs, provide feedback, and experience care, particularly at later stages of life. Lunar New Year is a timely reminder that culturally responsive, language-aware systems are a core part of genuinely person-centred care.
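One way to read the 30.1% figure above is as Cantonese's share among Australians who speak either Mandarin or Cantonese. That interpretation is an assumption on our part, but the rounded population percentages quoted above (2.7% Mandarin, 1.2% Cantonese) are consistent with it:

```python
# Population-level percentages as quoted above (rounded to one decimal).
mandarin_pct = 2.7
cantonese_pct = 1.2

# Cantonese as a share of the combined Mandarin + Cantonese group.
general_cantonese_share = cantonese_pct / (mandarin_pct + cantonese_pct)

# Survey-level share quoted above, for comparison.
aged_care_cantonese_share = 0.424
```

The rounded inputs give a share of roughly 31%, in line with the quoted 30.1% (which would follow from unrounded counts), against 42.4% in the aged care survey responses.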

  • How many hours does your team spend preparing reporting for QCAB meetings? For most providers, the issue isn’t whether the data exists. Incidents, quality indicators, improvement activities, and feedback are usually already being captured somewhere. The time sink is manually collating that information into a single QCAB report that meets legislative and governance expectations, typically every six months.

    MOA’s QCAB report was built to address that problem. It automatically collates the QCAB-relevant information that exists within MOA into a consolidated, structured report, ready to use for meetings. Where providers are using multiple MOA modules, those components are brought together into a single QCAB paper. Where only some modules are in use, the report reflects the information available.

    The most recent update adds feedback and complaints data to the QCAB report, allowing consumer experience to be reviewed alongside incidents, quality indicators, and quality improvement activity, rather than as a separate or manually prepared attachment.

    Using MOA modules more broadly doesn’t just improve how those areas are managed day to day. It also reduces the time and effort required to prepare QCAB reporting, with clear downstream efficiency and cost benefits. If you’re an MOA member and unsure whether the QCAB report is enabled for your organisation, or what it could include based on the modules you use, the team can help.

  • We launched MOA’s Feedback and Complaints system to give providers a practical, end-to-end way of managing feedback that supports both day-to-day improvement and regulatory expectations.

    The system supports feedback from any source and through multiple channels, including QR codes, website links, email and in-person capture. It also supports anonymous and confidential reporting, aligned with whistleblower protections under the new legislation.

    Feedback is managed within a single workflow, supported by dashboards and reporting that help services understand trends and underlying categories of feedback. Providers can also benchmark their feedback profile against other services, allowing them to see whether particular issues are unusually common or rare and to prioritise improvement effort accordingly.

    Where appropriate, complaints can be escalated through to root cause analysis, and corrective actions or quality improvement activities can be linked directly into their continuous improvement plan.

    We launched MOA Benchmarking’s Feedback and Complaints system in October last year. After three months and thousands of pieces of feedback, here are a few points that caught my attention from an early look at the data across October to December.

    💬 Most feedback is praise. Across the period, a (slim) majority of submissions were praise, around a third were complaints, with the remainder being suggestions.

    🥗 Food and catering is the most common feedback category. This is followed by clinical care, staff behaviour and conduct, choice and dignity, and communication and consultation. In every one of those categories, praise was more common than complaints.

    📊 Patterns within categories are different. Within food and catering, meal taste attracted the most praise, while food choice and variety generated the most complaints. In clinical care, care planning was both the largest source of praise and the largest source of complaints.

    🧓 Feedback mostly comes from residents and their supporters. There is also a substantial contribution from staff and volunteers, along with a smaller proportion from others or those preferring not to be identified.

    ◼️ Anonymous reporting is used more for complaints and suggestions. This is less common for praise. The system also supports confidential feedback aligned with whistleblower protections, though only a small number of submissions met that threshold.

    ⏰ Higher-risk complaints are a small proportion. Just over 7% of complaints were assessed as high risk. Around 3% were identified as requiring root cause analysis, and only a small number were considered potentially reportable.

    Early days, but a useful first look at how feedback is being used in practice.

  • Clare Stronach, our Clinical Quality Officer, is in New Zealand this week meeting with existing clients and organisations interested in our New Zealand program. The program is designed to support aged residential care homes to meet the requirements of the new Ngā Paerewa Health and Disability Services Standard. When combined with our incident management system, risk management tools, and feedback and complaints module, it provides an integrated approach to quality and governance that keeps people at the centre of care. If you would like to learn more about the New Zealand program, you can contact us via www.moabenchmarking.co.nz.

  • MOA brings together the core pieces of quality and risk management into a single, connected system. Incidents, feedback and complaints, audits, risk registers and quality indicators are not treated as separate workflows, but as related signals that inform each other. That makes it easier to identify patterns and emerging risk, respond consistently, and turn what you see in the data into targeted actions. All of this feeds into a structured approach to continuous improvement, where issues are followed through to resolution and corrective actions are tracked, not lost. The platform integrates with existing systems, supports two-way data flows, and is designed to reduce duplication while improving visibility across services.

  • Every quarter, our support team makes more than 1,300 calls to providers to help resolve data issues and improve data quality. That work matters not just for individual submissions, but for the accuracy of benchmarking across the entire MOA community. NQIP season is one of the busiest periods for our team, so feedback like this is genuinely appreciated. Credit to Riona, Clare, Emma, and the MOA support team for the behind-the-scenes work during NQIP season.

  • Incidents are unavoidable in care. Poor visibility, fragmented follow-up, and weak governance are not.

    Our Incident Management System is designed to support the full incident lifecycle. Incidents can be recorded as they happen at the point of care, using structured workflows that support accurate classification, including decision support for SIRS. Guided post-incident review helps ensure contributing factors, actions, and outcomes are documented consistently, while configurable notification rules and escalations ensure the right people are in the loop so nothing stalls or quietly drops away.

    Dashboards then provide clear visibility of trends and emerging risk signals across services and organisations. Seamless integration with our PCI module and clinical indicator reporting reduces duplication of data entry and supports continuous improvement.

    A better way to manage incidents.

    For a demonstration, contact the team on 1300 760 209 or by using the contact page on www.moa.com.au. For existing clients, speak with Clare Stronach or Jordan Lawrence to learn more.

  • Sometimes a complaint isn’t just a complaint. A single piece of feedback can contain multiple, and sometimes conflicting, signals. In this example, a family member raises a concern about how staff spoke to their mother, while also providing positive feedback about the food.

    Treating this as one undifferentiated complaint loses useful information. The issue about staff communication needs to be addressed, reviewed, and closed out appropriately. At the same time, the positive feedback about food should still be captured, reported, and reflected in your data.

    The parent-child feature in MOA’s Feedback and Complaints module allows you to do both. One item of feedback can be separated into linked child items, each with its own category, workflow, reporting, and outcome, while still being managed from a single view. The result is less manual handling, clearer governance, and more accurate reporting. You can see what needs attention, what is working well, and how often both occur, without losing the context in which the feedback was given.

    For a demonstration, contact the team on 1300 760 209 or by using the contact page on www.moa.com.au. For existing clients, speak with Clare Stronach or Jordan Lawrence to learn more.
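The parent-child idea described above can be sketched as a small data structure. The class, field, and category names below are hypothetical, not MOA's actual data model:

```python
from dataclasses import dataclass, field

@dataclass
class FeedbackItem:
    """Hypothetical sketch of a feedback item that can hold linked children."""
    text: str
    category: str
    feedback_type: str  # e.g. "complaint", "praise", "suggestion"
    status: str = "open"
    children: list = field(default_factory=list)

    def split(self, text, category, feedback_type):
        """Separate part of this feedback into a linked child item,
        with its own category, workflow, and outcome."""
        child = FeedbackItem(text, category, feedback_type)
        self.children.append(child)
        return child

# One submission containing two distinct signals.
parent = FeedbackItem(
    "Concern about how staff spoke to Mum; the food has been lovely.",
    category="mixed", feedback_type="complaint",
)
complaint = parent.split("How staff spoke to Mum",
                         "staff behaviour and conduct", "complaint")
praise = parent.split("The food has been lovely",
                      "food and catering", "praise")
```

Each child is then reported under its own category and type, while the parent keeps the original wording and a single management view, which is the benefit the post describes.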

