Evaluation Of Penda: A Financial Empowerment App

Evidence type: Evaluation

Description of the programme:

Penda (its name taken from an Australian native plant that represents renewal and hope) is Australia’s first app for women who have experienced domestic and family violence (DFV) and are thinking about separation and divorce. Penda is a free, easy-to-use app with safety, financial and legal information and referrals for women who have experienced DFV. While Penda is not intended to replace key services such as domestic violence support services, lawyers or the police, it aims to increase users’ knowledge and connect them with relevant services.

The study:

Ithaca Group was commissioned by the Women’s Legal Service Queensland to evaluate the Penda app. The evaluation had three main purposes:

  • To inform the design and development of Penda;
  • To assess its usefulness, safety and user-friendliness;
  • To draw conclusions relevant to future financial literacy programmes via apps or programmes for women affected by domestic and family violence.

The evaluation took place between May 2016 and June 2018. This report is the final of four published over the lifespan of the evaluation. The evaluation had four components, the first three of which were formative and conducted while working alongside the Women’s Legal Service Queensland. The fourth stage occurred after the app had been live for six months and aimed to draw initial conclusions about its success. The four stages comprised:

  1. A non-exhaustive review of relevant literature to help identify the main topics for inclusion in the app, validate the evaluation methodology, and identify considerations for future research in this field.
  2. ‘Wireframe testing’ which included working with survivors of DFV as well as service providers to obtain feedback on the ‘wireframe’ (a draft version of the app). The methods used included in-depth interviews with DFV survivors and one small focus group, as well as stakeholder feedback.
  3. ‘Beta testing’, a further round of testing to assess how user-friendly the app was and how to make it as safe as possible for DFV survivors to use. Respondents were sent the beta version and interviewed by telephone a few weeks later.
  4. The final app evaluation comprised:
    • Usage patterns from Google Analytics;
    • Responses to an in-app feedback form from app users;
    • Telephone interviews with service providers;
    • Other in-app feedback;
    • App store reviews.

The remainder of this summary deals with the findings from the final, summative evaluation.

Key findings:

  • Google Analytics showed that the app had been downloaded 5,376 times by June 2018. In total, there were 11,601 sessions logged on the app, with an average duration of 3-4 minutes.
  • Advertising in public washrooms appears to have been effective: campaign periods coincided with peaks in downloads of the app (915 users in a month, compared with the monthly average of 500 users).
  • On average, three in five users (60%) were returning users.
  • General feedback from a small sample of the app users was positive. However, due to the very small numbers of participants it was hard to find definitive themes in the feedback.
  • Unfortunately, it proved difficult to engage service providers, and no telephone interviews were secured. Of the 14 responses the evaluators received to their online survey, typical comments included: ‘relevant information’; ‘easy to use’; ‘very helpful’ and ‘good links to outside resources’.
  • Of 52 user comments received through the app, 25 were positive, 15 provided constructive suggestions, eight reported technical glitches, and four were negative.
  • Limited data from app store reviews suggested a high satisfaction rating among app users.
  • Due to the low response rate from participants and service providers, it was not possible to draw conclusions about the viability of the app as an information delivery channel compared with other platforms such as websites or booklets.
  • The evaluators state that user testing adds enormous value to the development of a product. They also highlight the importance of a targeted communication campaign, and recognise the difficulty in eliciting information from DFV survivors.
  • The evaluators use their evidence to conclude that the app is safe to use, with the caveat that nothing can be completely safe in the context of DFV.

Points to consider:

Methodological strengths and limitations:

  • This evaluation is based on a small number of participants, so more information is needed before using it as a basis for scaling up the intervention.
  • The safety of the participants was, commendably, the primary concern throughout this evaluation, meaning some data could not be collected.

Generalisability/ transferability:

  • While these findings are from an Australian programme, many of the learning outcomes may be transferable to a UK context.
  • The evaluation is of significant interest to people who are considering using apps to deliver financial education and information to vulnerable groups, and in particular to those targeting victims of domestic and family violence.

Key info

Programme delivered by:
Funded by: the Women’s Legal Service Queensland
Year of publication:
Contact information: Rachel Healy, Susan Tape, Julie Hopwood; Ithaca Group