I am preparing for a return visit to New Zealand, working again for Creative New Zealand (CNZ), to visit their clients to carry out Evaluating Databases Audits. I am following on from a project started by my late colleague Tim Roberts, in a process we had originally developed based on our own separate experiences in Australia and the UK, shaped in discussion with Helen Bartle of CNZ.
Both consultants and arts marketers have to work in the real world, and Tim and I kept asking similar questions after visiting and talking to practitioners:
- Why don’t managers know about the data the staff of their organisation collects?
- Why don’t staff understand the basics of data capture and building useable customer records?
- Since Data Protection is at the heart of collecting and using customer data, why do so many organisations not know about the details of the law?
- Since staff work with “databases” each day, why do they get confused about “lists” and apparently create multiple “lists” with duplicate and non-synchronised records?
- What are the real hurdles to staff joining up their data, synchronising between applications, and working with a 360-degree view of their customer interactions?
- What leads organisations, that could install a comprehensive software solution for ticketing, marketing, CRM, fund-raising, friends, loyalty and membership schemes, to instead choose separate and unconnected solutions?
- Where do the issues come from that mean data usage is often dysfunctional in an organisation, sometimes working against its own interests?
- Where do Boards and Senior Managers sit in the responsibility for ensuring competent practices in their organisation (and using data in their decision-taking)?
- How do we intervene to help organisations help themselves to work less painfully and altogether smarter?
- How do we address – long term – the creation of professional standards and the challenges of staff turnover and loss of institutional memory?
Whoa, you might say, that’s heavy. But the questions haven’t changed in 20 years. And the questions are similar from Arts Council staff across the UK, other consultants, and of course some of the software solutions providers who wonder just what is going on in their users’ organisations.
The Evaluating Databases Audits are a relatively simple process:
- We start with a briefing conference call to talk through the process, to establish, for example, the “key dimensions” of the organisation (you may not be surprised how often they don’t know them), and to identify the key staff who need to be involved (you may not be surprised how often they don’t know who those should be). The basis of the audit is that senior managers must be involved – in practice Artistic Directors, General Managers, CEOs – and it changes the dynamic of the whole thing.
- We start around 9.45/10am with a group session around ‘mission’ and ‘vision’, trying to understand where the organisation is going and what its data needs might be. We usually invite a Board member to attend (sometimes CNZ makes this a condition of funding the audit). This usually proves a rare opportunity for an internal discussion about translating their ‘mission’ and ‘vision’ into reality and the challenges they face in achieving it, often of staffing, resources and internal competences, let alone all the other factors.
- After 11am, we hope to get into the actual audit, ideally going round the organisation and sitting with staff at their screens to see what they are using, what data they have, and what they are doing with it and using it for. This is usually where we hit the confusion about “lists”: apparently separate multiple “lists” with duplicate and non-synchronised records, often on separate software tools, often without interfaces, and different functions unaware of what other functions in their organisation are using. In this stage of talking closely with staff we usually find the internal challenges, the levels of understanding and competence, and how hard or easy it is for staff to get what they need to do, done. Issues of data quality, data capture, Data Protection and data processing emerge quite quickly. But this is also a huge knowledge-sharing session: filling in gaps in experience, explaining laws and practices, pointing out what is possible, and briefing about audience development practice.
- By lunchtime we have usually identified the key points the organisation has to address and we offer to explore these and their implications with a senior manager over lunch. On many occasions they ask that this stage also involves the rest of their staff in a group, which we prefer, though it is challenging not to upset some people in the process. The ones who get most discomforted are those who have toiled at their task without being told there were better/easier/quicker/more accurate/friendlier processes/tools. You can imagine the brainstorming that happens as people identify what could be done to address the points and see some quick fixes as well as much bigger and harder issues.
- We go into the afternoon by building an Action Plan with tasks allocated and a simple timetable for implementation. This often involves proposed changes in software solutions and the procurement of new systems. There will be those daunted by what is ahead and what they may personally need to do, and we have to address their concerns. But we try and keep it focussed on the benefits of change. After long years as a consultant, I know change can be painful, so my first rule is that pain needs to be over in as short a time as possible, and if you can find an accelerator, stamp on it.
- The day ends with a de-brief, which again many senior managers want all the staff involved in the process to be present for. And again, the Board member in at the beginning often returns. So often this concludes with a much clearer understanding of how data is going to make a difference to their working lives and the success of their organisation. And staff feeling much clearer about both what is possible and what it is for.
The Audit is followed up by a written report, shared with CNZ. And CNZ often comment on how that report goes way beyond what you might expect from an Evaluating Databases session, as it reflects the wide-ranging ‘vision’ and ‘mission’ discussions we had. The good news is they usually invite the organisations to apply for funds towards implementation, though that does focus more on the implementation of database improvements.
I am working in New Zealand with David Martin and Michelle Gallagher again this trip, as I am mentoring them to pick up the baton and carry on the work. They both had experience as arts managers before acquiring their data-geek status, so they are able to work inside arts organisations on audience development and so on, as well as inside the software. They have experience of a number of different ticketing systems, with PatronBase and Tessitura knowledge under their belts, and both work ‘hands on’ with arts organisations as the core of their work. Like my UK colleague Andrew Thomas, they bring a weight of experience alongside their technical skills, forming that 360-degree holistic view of what is really going on.
I think I see this making a difference, which is why I give priority to it. And the questions that Tim and I asked in the past persist. Indeed, I sometimes worry that the situation is getting more challenging under the pressure of reduced funding for the arts across the world. The UK’s Audience Agency (of which I am now a Board member) supplies the Audience Finder tools to Arts Council England NPO/MPMs, and in the summer of 2016 Firetail carried out a survey of the users; astonishingly, 42% of survey respondents indicated that their organisation did not have the skills and capacity to make full use of Audience Finder, with a further 6% not knowing whether they did! That suggests there is still a lot to be improved. It raises the question of whether we need an Evaluating Databases Audit project in the UK too. Not conducted by me, I hasten to add…
New Year 2017