When we embarked on the transformation of the Cultural Data Profile, we embraced an agile, iterative product development process: we actively solicit and track feedback from our stakeholders and use it to improve the user experience, develop new features, and optimize existing ones.
One of our top goals when we redesigned the CDP, based on feedback from our users, was to reduce the time arts and culture organizations spend on data entry. Our users tell us we've made good progress, and many happily report that the new profile takes less time to complete.
We did many things to accomplish this, including cutting the number of expense line items in half and eliminating tricky data points that users told us could be confusing and time-consuming.
In our role as data collectors, we strive to balance the effort it takes for our busy arts and culture users to enter data against the accuracy we need to ensure the data’s utility. It’s a challenge.
However, we want to share with you one place where we think we got it wrong, and what we're changing to make it right.
In the previous platform, the survey had a place for audited organizations to enter totals from their audits. An internal error check then made sure that other inputs, when calculated, matched the totals.
This caught a lot of data-entry errors, but we also heard from many users that reviewing and resolving the items the system automatically flagged as discrepancies could be very time-consuming.
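To illustrate the kind of check described above, here is a minimal sketch of comparing user-entered audit totals against totals calculated from individual line items. The field names, data shapes, and rounding tolerance are illustrative assumptions, not the CDP's actual implementation.

```python
def flag_discrepancies(audit_totals, line_items, tolerance=1.0):
    """Compare entered audit totals against sums of line items.

    audit_totals: dict mapping a section name to the total the user
                  entered from the organization's audit (assumed names).
    line_items:   dict mapping the same section name to a list of
                  individually entered amounts.
    Returns a list of (section, entered, calculated) tuples for every
    section where the two disagree by more than `tolerance`.
    """
    discrepancies = []
    for section, entered in audit_totals.items():
        calculated = sum(line_items.get(section, []))
        # Allow a small tolerance so rounding differences aren't flagged.
        if abs(entered - calculated) > tolerance:
            discrepancies.append((section, entered, calculated))
    return discrepancies


# Example: the entered audit total disagrees with the line items by $5,000.
flags = flag_discrepancies(
    {"total_expenses": 125000.0},
    {"total_expenses": [60000.0, 40000.0, 20000.0]},
)
```

A check like this is what lets the system surface a discrepancy at entry time, before the data reaches a Funder Report.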
After careful consideration during our redesign, we originally chose to eliminate the requirement that users input the totals from their audit. Instead, a manual review section at the end displayed totals as calculated in the Profile and encouraged users to verify their accuracy. Our intention was to let users take control of the accuracy of the data, since it’s their data that represents their organization.
However, as we reviewed data from the new system, we learned that some users would complete their data entry, run a Funder Report, and only then notice issues with their data. They would then have to take time to go back and fix their Cultural Data Profile. In the worst cases, if not double-checked by the user, incorrect data could become part of the final submission.
In the end, we determined that we have an important role to play in providing the structure and tools that help organizations present their data accurately. In response, we're adding back the questions that feed the error check, and we'll enhance our verification features. Look for these changes by the end of November.
The new procedure will work like this: audited organizations will once again enter totals from their audits, and the error check will compare those totals against the figures calculated from their other inputs, flagging any discrepancies for review.
As a final and very visible check on data integrity, we’ll also add a review that will run when the survey is complete. It will check for errors and warnings in all sections of the completed profile, and present a condensed list of errors and warnings to be addressed by the user.
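The final review described above can be sketched as a pass over every section of the completed profile, gathering all errors and warnings into one condensed list. The section names and rules below are invented for the example; they are not the CDP's actual validation rules.

```python
def final_review(profile, rules):
    """Run every rule against every section and collect findings.

    profile: dict mapping a section name to that section's entered data.
    rules:   list of (severity, message, predicate) tuples, where the
             predicate returns True when the section has the problem.
    Returns a condensed list of (severity, section, message) findings.
    """
    findings = []
    for section, data in profile.items():
        for severity, message, predicate in rules:
            if predicate(data):
                findings.append((severity, section, message))
    return findings


# Hypothetical rules: hard errors block submission, warnings just advise.
rules = [
    ("error", "negative amount entered", lambda d: any(v < 0 for v in d.values())),
    ("warning", "section left empty", lambda d: not d),
]

profile = {
    "revenue": {"ticket_sales": 50000, "grants": -100},
    "expenses": {},
}
report = final_review(profile, rules)
```

Collecting everything into one list is what makes the review "condensed": the user sees all outstanding issues in one place instead of hunting through each section.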
We’ll host a training webinar on the new feature for our users on November 20. Click here to register.
As always, we’ll be listening hard and collecting ongoing feedback to make sure that we’ve now gotten it right.