The Evaluation Hub works by asking community members, stakeholders and project teams a series of 20 standardised questions that have been tested and reviewed by independent experts.
From providing a snapshot of your project's strengths and weaknesses to giving you greater insight into the views of both the project team and the participants, the Evaluation Hub improves industry performance by identifying and articulating lessons learned and achievements.
The 20 questions explore:
Using a series of heatscales, the Evaluation Hub avoids providing any single rating to a project. Based on years of research, the system has been designed to provide information to in-house practitioners and consultants to use in their own work.
The final reports, which are automatically generated, will allow organisations to compare their performance across several criteria depending on their subscription level including:
It begins with all participants (community members, stakeholders and project teams) answering the same set of 10 standardised questions. By using standardised questions, the Evaluation Hub provides a structured way of assessing engagement, allowing for continuous improvement.
Most of the questions use a 9-point scale, with some allowing for open-text responses. This is the Standard Level and is perfect for those involved in a one-off project. The Evaluation Hub offers two other levels that let you add further questions, selected from a ready-made list or written yourself.
These questions were developed through extensive research and reflection, with the Evaluation Hub engineers spending months reviewing academic research and large-scale evaluation projects, as well as consulting some of the best and brightest minds in the industry.
Established in 2016 by leading communications and engagement practitioner Amanda Newbery, the Evaluation Hub is a public participation advisory service that works closely with key stakeholders to advance and extend public participation through standardised benchmarks that meet global quality-assurance standards.
Over the coming months, an advisory group of experts will be established to guide the evolution of the Evaluation Hub. The advisory group will work closely with Amanda to develop the Evaluation Hub into a complete service.
Amanda will not view results from any individual project unless requested by an account holder; instead, she will have the genuine joy of analysing aggregate results across the industry.
The world’s first mobile evaluation system for community and stakeholder engagement has been designed by long-term supporter, volunteer, trainer and former board member of the International Association of Public Participation Australasia (IAP2A) Amanda Newbery.
Widely respected for thinking outside the square, Amanda sits on several boards where she provides valued strategic brand and stakeholder engagement insights.
As a trainer for a decade, Amanda has spoken first-hand to hundreds of engagement experts, decision makers and project teams, and understands that if the profession is to continue to grow, more work needs to be done on evaluation. But it needs to be easy.
Rob Gravestocks has steered the Evaluation Hub from a twinkle in Amanda's eye to the online tool you can see and use today.
Rob has more than 20 years’ experience in stakeholder and community engagement, marketing and communications.
He honed his knowledge of community and stakeholder engagement at IAP2 where he managed and coordinated the training arm of the organisation.
Rob also project managed the new Australasian Certificate in Engagement for IAP2.
123 Charlotte Street
Brisbane Qld 4000
PO Box 15026
City East Qld 4002