Find out more about how we benchmark social impact to help drive key performance results for your business. Got a question that isn’t answered here? Get in touch!
Who are 60 Decibels?
60 Decibels is a tech-powered impact measurement company that makes it easy to listen to the people who matter most. Using our Lean Data℠ approach, we speak directly to customers, employees or beneficiaries, usually over mobile phones. We return high-quality data to you in weeks, with minimal hassle or distraction, to help you maximize your impact and grow your business.
How does a 60 Decibels project work?
Our main responsibilities are to:
- Understand your business context and impact thesis
- Translate your impact thesis into an agreed-upon set of metrics
- Write a Lean Data survey to gather these metrics, using our tried-and-tested questions developed over more than 700 projects. This survey is a mix of qualitative and quantitative questions.
- Receive customer contact details from you (typically mobile phone numbers)—a large enough number so we can pick an appropriate representative sample.
- (Once you’ve approved the survey) Deploy the survey using our team of 1,000+ trained in-country researchers in 80 countries, randomly calling from the list of numbers you’ve provided
- Clean and code all of the collected data
- Analyse the collected data and create a report that we deliver to you
Our goal, with this approach, is to make it as straightforward as possible to get the impact performance and customer feedback data that will help you serve your customers better, create great impact for them, and drive improved results for your business.
What makes 60 Decibels different?
Since 2014, we’ve been building our approach to impact measurement working with resource-constrained social businesses. Our first clients were in the developing world (mostly in South Asia and Sub-Saharan Africa), though now much of our work is in the U.S. and Europe. We understand that you have a growing business to run, and we recognize the significant constraints on your time and resources. At the same time, we know that you’re deeply committed to making real and lasting change for the customers you serve.
Our approach is designed to get you the information you need to serve these customers better in ways that improve their lives. We do this by bringing deep domain expertise in social impact measurement; a multi-sector library of tested questions to understand social impact; an approach that is fast, nimble, and burden-free and that’s proven to work remotely (over mobile phones); and the world’s largest database of benchmarked, customer-level social impact data, gathered from more than 100,000 customers.
How much involvement is required from me?
We aim to make this as easy for you as possible. We will schedule one kickoff call with you to discuss the process and answer your questions, and to give us the chance to get to know you, your business, and the customers/beneficiaries you serve.
This call forms the foundation of our first survey draft. We will send you a copy of this survey to review. Once we’ve incorporated any feedback and have your sign-off, your next steps depend on the method we have chosen together. Most of the time, we conduct phone surveys, so we’ll need a list of your customers and their phone numbers. If we are using email, we’ll need their email addresses.
At this point, you’ll typically turn things over to us. In some cases, though (whether to increase response rates or simply to avoid surprises) you’ll choose to let your customers know in advance to expect our phone call. If you decide to do this, we’ll provide all the supporting information you need to make this as easy as possible (e.g. if you want to send the email, we will provide you with a link to the survey and a pre-drafted email so all you have to do is hit send!)
We’ll take it from there.
What do I get at the end of a project?
At the end of the project you will receive a results deck with the impact analysis as well as recommendations on how to best utilize this information, and the cleaned full dataset of customer responses that you can use for ongoing analysis. We will also provide the results of an optional staff quiz designed to engage your business with the impact results.
Who will be calling my customers?
We deploy our surveys using our team of local, trained 60 Decibels researchers. We have more than 1,000 researchers in 80+ countries, all of whom speak to customers in their local language. Most of these researchers have been with us for 2-3 years, though a few have been working with us for 8 years, and many have experience with both in-person and phone-based research. We keep KPIs on all researchers’ performance, and they are rewarded with bonuses for quality data collection (see ensuring quality section below).
These researchers undergo extensive training with us. We select and train them to listen critically, create rapport, be able to adapt and adjust to a situation, and facilitate the best interviews possible. This is because, in addition to placing strong emphasis on quality and integrity of data, we also put respondent experience at the forefront of everything we do. All researchers have signed confidentiality and non-disclosure agreements with 60 Decibels.
Our current base of researchers speak the following 171 languages: Acholi, Afar Af, Afrikaans, Akan, Alur, American Sign Language (ASL), Amharic, Arabic, Asante, Assamese, Ateso, Awadhi, Bahasa Indonesia, Balochi, Bambara, Bangla (Bangladesh), Bantu, Bariba, Bemba, Bengali (India), Bosnian, Cantonese, Cebuano/Bisaya, Chagga, Changana, Chewa/Chichewa/Nyanja, Chittagong, Colloqua, Czech, Dendi, Dioula, Ebiram, Embu/Kiembu, English, Ewe, Fante, Filipino, Filipino Sign Language, Fon, Fongbe, French, Fullah, Ga, Gamogna, Garhwali, Georgian, German, Giriama, Gofagna, Guaraní, Gujarati, Haitian Creole, Hausa, Hindi, Igbo, Ihukonzo/Rukonzo, Iku/Arhuaco, Indian Sign Language (ISL), Japanese, Javanese, K’iche’, Kabye, Kamba, Kannada, Kaqchikel/Cakchiquel, Kenyan Sign Language, Khmer, Kikuyu, Kimeru, Kinyarwanda, Kipsigis, Kisii/Gusii, Kono, Kotokoli, Krio, Kuria, Kyrgyz, Lamba, Lambya/Rambia, Lang’o, Lao, Liberian English, Limba, Lingala, Lozi, Luganda, Lugbara, Lugisu/Masaba, Lugwere/Gwere, Luhya, Lunyakole/Runyakole/Nkore, Luo/Dholuo, Lusoga/Soga, Maasai, Madi, Malagasy, Malay/Bahasa Malaysia, Malayalam, Mampruli, Mandarin, Manipuri, Marathi, Mende, Mina, Moore, Myanmar/Burmese, Nago, Ndebele, Nepali, Norwegian, Nupe, Nyiramba, Oriya/Odia, Oromo/Oromiffa/Afan, Pakistan Sign Language, Pashto, Peulh, Pidgin, Portuguese, Pular, Punjabi, Q’eqchi’, Ronga, Rukiga, Runyakitara, Runyoro, Russian, Rutooro, Sena, Sepedi (Northern Sotho), Serere, Sesotho (Southern Sotho), Shona, Sidamigna, Sindhi, Sinhala, Sisaale, Siswati, Somali, Spanish/Español, Sundanese, Susu, Swahili/Kiswahili, Tagalog, Tajik, Tamang, Tamil, Telugu, Temne/Themne, Thai, Tigrinya/Tigrigna, Tonga/Chitonga, Tsonga, Tswana/Setswana, Tumbuka, Turkana, Turkish, Twi, Urdu, Uzbek, Venda, Vietnamese, Waray Waray, Wolayitigna, Wolaytta, Wolof, Xhosa, Yalunka, Yoruba, Zulu
Note: If we do not currently have researchers who meet your language needs, we will use our large network of pre-existing resources or locally-based partners to find and train a team that meets your needs.
Why do you need information about my business?
In order to understand your customers and how your business is changing their lives, we need to understand some basic information about how and why they use your products/services. This informs our research design and gives us context on what questions to ask and what analysis will be most useful to you.
It’s important to note that our surveys intentionally blend “pure impact” questions with more traditional customer insights to drive maximum value and relevance for you. Our more “mainstream” questions include Net Promoter Score®, Customer Effort Score, and more general feedback about the product or service and after-sales support.
We have found that this mix of questions makes it most likely that we will gather data that will be most relevant to the business decisions you have to make every day.
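As context, Net Promoter Score® is conventionally derived from a single 0–10 “how likely are you to recommend” question: the share of promoters (scores 9–10) minus the share of detractors (scores 0–6). Here is a minimal sketch of that standard calculation (an illustration of the public NPS convention, not 60 Decibels’ internal code):

```python
def net_promoter_score(ratings):
    """Compute NPS from 0-10 'likelihood to recommend' ratings.

    NPS = % promoters (9-10) minus % detractors (0-6),
    expressed on a -100 to +100 scale. Scores of 7-8 are passives.
    """
    if not ratings:
        raise ValueError("no ratings provided")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

# Example: 5 promoters, 3 passives, 2 detractors -> NPS of 30
print(net_promoter_score([10, 9, 9, 10, 9, 8, 7, 8, 5, 3]))  # 30.0
```

Because the score nets detractors against promoters, the same average rating can hide very different distributions, which is one reason qualitative follow-up questions matter.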
Why are you asking for my customer contact details – what specific information do you need to do your work?
We need this information to contact your customers. 60 Decibels only speaks to participants who have provided their consent to be contacted. Typically we will ask you to provide us with:
- Respondent Name
- Phone number (where phone & SMS surveys are conducted)
- Email (where email surveys are conducted)
Where possible, we also ask for additional non-sensitive personal data, such as product purchased or approximate geography (city or state). This enables us to ensure a representative sample and conduct additional segmentation analysis, where required.
What access does 60 Decibels have to the data that’s collected?
Select members of the project team have full access to the data while it is being collected and analyzed. Once collected and cleaned, your data will be analyzed to produce your results deck. You will also receive the cleaned raw dataset at the end of the project.
No one besides the 60 Decibels team and your company will see customer personally identifiable information (PII). When sharing raw data with investors/clients who are not the company, personal contact information (name, contact number) will be removed. In cases where legal standards require anonymity, we can collect the data in such a way that is compliant with these standards.
Once the project is completed, we strip all personally identifiable information (PII) from the data set and include the metrics in our anonymized and aggregated database for benchmarking purposes. Our management of PII, as well as what data we can share in what form, is governed by our contract and confidentiality agreement with you.
Will the individual customer responses be anonymous?
At the end of each survey, we ask respondents if they agree to have their name and information shared. In these cases, this information is identifiable. Otherwise, we anonymize it.
Will my investor (or anyone else) see the results of the project?
Results will be shared if and only if the company has given permission. When permission is granted, we often provide company and portfolio-level reports to investors. If permission is not granted, we can provide anonymized versions of these reports for investors. We will not externally share your results in an identifiable fashion without your prior written permission.
How do I know my data will be safe?
60 Decibels maintains and executes a robust set of policies and internal controls that ensure that we meet the highest standards for data collection, handling and storage and that we comply with the law. 60 Decibels also engages experts at regular intervals to audit our internal processes to ensure that all policies and procedures are compliant and meet certain data protection standards.
60 Decibels has taken the necessary measures to ensure all data collection, transmission, storage, and access processes are GDPR compliant. This includes:
- Responsibility: Defined roles exist within the organization to ensure a system of checks and balances in data access and use across the organization.
- Use: Use of the data is restricted to the terms of the contracts and in compliance with the law. 60 Decibels employees and contractors are required to comply with detailed protocols around data security and use, and periodic training ensures adherence to and clarity on these protocols.
- Storage: Data is stored in 60 Decibels’ restricted-access, cloud-based document storage system upon receipt. Data is then pseudonymized in 60 Decibels’ database (securely in AWS) to be used for insights, analysis and aggregate benchmarking.
- Access: Access to data at each point of the project cycle is restricted to 60 Decibels team members and those directly involved in the execution of the project. All team members with access rights to the data have signed non-disclosure agreements and are subject to the company data security protocols.
- Contracting: 60 Decibels uses contracts and non-disclosure agreements with clear language around data security with all clients, companies, and third-party contractors.
- Disclosure: 60 Decibels’ policies limit disclosure of data to statutory disclosure obligations, including Subject Access Requests and disclosure as required by enforcement agencies.
- Deliverables: Deliverables to clients are restricted to fully anonymized data and aggregated insights. No personally identifiable information is included in client deliverables.
- Retention: Upon completion of a project, original data is destroyed and pseudonymized responses to survey questions are retained in 60 Decibels’ database (securely in AWS).
- Third-party service providers: All third-party technologies and software providers that 60 Decibels engages are evaluated against the data privacy criteria outlined in 60 Decibels’ data policies.
- Ongoing compliance: 60 Decibels recognizes the evolving nature of data security compliance globally. As such, 60 Decibels regularly assesses policy compliance and makes needed revisions to policies and systems to ensure continued adherence to industry best practices and additional legal requirements.
How many customer contact details do I need to provide?
In a typical project, we will gather ~200-250 responses. In order to get a randomized, representative sample of customers, we typically request you share a database with a full list of your active customers. We select an appropriate, representative sample and aim to maximize the response rate of this sample. If you are uncomfortable sharing the full customer list with us, we request you share a database with 5-10 times the total number of respondents (i.e. 1,000 to 2,500 customers), randomly selected from your customer base.
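The selection step above can be illustrated with a simple random draw from the customer list. This is only a sketch of the idea, with made-up data and hypothetical field names, not 60 Decibels’ actual sampling code:

```python
import random

def draw_sample(customers, n, seed=None):
    """Draw a simple random sample of n customers (without replacement)."""
    rng = random.Random(seed)  # seeding makes the draw reproducible
    return rng.sample(customers, min(n, len(customers)))

# Hypothetical customer database shared by the company
customers = [{"name": f"Customer {i}", "phone": f"+000{i:04d}"}
             for i in range(2000)]

# Select ~250 contacts at random from the full list
sample = draw_sample(customers, 250, seed=42)
print(len(sample))  # 250
```

Drawing without replacement from the full list is what makes the sample random rather than, say, the first 250 rows of a spreadsheet, which could be skewed toward the earliest or most recent customers.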
What if I don’t have contact details for my customers?
If this is the case, we will work with you to find the best solution. This could include conducting a campaign to gather phone numbers (radio, lotteries, handing out business cards with SMS short-codes) or using an existing available database of customer contact details.
What if I don’t work directly with end beneficiaries (I’m a B2B business)?
In this case we’ll work with you to determine the best approach: either gathering data from your business customers, or working in partnership with them to find a way to reach their end (consumer) customers.
Can I add my own survey questions?
In order to provide you with comparable benchmarked data, the majority of the questions we use are ones we’ve developed and refined in multiple prior projects. This ensures the highest quality and comparability of results.
Once you’ve reviewed these, you may still want to add questions to the survey. The total number of survey questions you can add will be determined by the scope of the project and the overall length of the survey. In most cases we see companies adding a maximum of 3-5 additional questions to a survey.
Since our core expertise is in writing survey questions, we will work to understand what questions you want to ask and then draft those questions for your approval.
How do you decide to do an online survey versus a phone survey?
In most (70%) cases we conduct voice surveys over the phone. We find this gives us the highest quality quantitative and qualitative data. Where customers have good connectivity and access, and are used to hearing from you over email, we might instead deploy email surveys—we will make a specific recommendation to you at the outset of our project. We supplement these core approaches with SMS (both for data-gathering and to inform customers to expect a call) and, in exceptional cases, IVR.
When deciding which method to use, the key considerations are (i) the connectivity and devices your customers have; (ii) how your customers typically hear from you; (iii) the contact information you have; (iv) the type of product you are offering; and, most importantly, (v) what is most valuable to you.
I don’t get legal language – have you got a simple 1 pager to summarize the Terms of Service?
Yes, you can find it here.
What if the benchmarking results have me performing poorly?
The 60 Decibels benchmark is just one of many indicators of your performance. It is meant to provide context for your results, ideally from comparable companies (with the same business model and in the same region).
We provide benchmarked results to be helpful to you, and wherever possible we will share recommendations based on what we have seen from the top performers in each sector or category. Our hope is that these results and recommendations provide you the tools to improve—and not a definitive statement now and forever!
What is your sampling strategy?
In general, we aim for randomized, representative samples of your total population of customers. If you request specific cuts of the data (by region, product, gender, etc.) then we will aim to get a random representative sample of each of the requested groups.
We will target a confidence level of more than 90% with a margin of error of 5%. To achieve this, we will typically interview ~300 respondents per project. If you require many cuts of the data (e.g. two products, by gender, by region) then we will typically have to increase our overall sample size to get sufficient representation of each of your sub-groups of interest.
To note: while the standard for academic research is typically a 95% confidence level, we believe that a 90% confidence level is high enough for people running and supporting businesses and NGOs.
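For intuition, the textbook sample-size formula for estimating a proportion, n = z^2 * p(1-p) / e^2, shows why roughly 300 interviews comfortably clears a 90% confidence level with a 5% margin of error. This is the standard statistical approximation, not necessarily the exact methodology used:

```python
import math

# z-scores for common two-sided confidence levels (standard normal)
Z = {0.90: 1.645, 0.95: 1.960}

def sample_size(confidence, margin_of_error, p=0.5):
    """Minimum sample size for a proportion estimate (large population).

    Uses n = z^2 * p * (1 - p) / e^2 with the conservative p = 0.5,
    which maximizes the required sample size.
    """
    z = Z[confidence]
    return math.ceil(z**2 * p * (1 - p) / margin_of_error**2)

print(sample_size(0.90, 0.05))  # 271 respondents at 90% confidence
print(sample_size(0.95, 0.05))  # 385 at the stricter academic standard
```

Note how moving from 90% to 95% confidence raises the requirement from about 271 to about 385 respondents, which is the trade-off the paragraph above describes.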
How do you ensure responses are representative?
By randomizing who we call and sampling correctly, we set up a foundation for creating a representative sample of your population. However, even with this strategy there is risk that the respondents will not be representative of your customer population. Unrepresentativeness can be caused by systematic bias in who responds to our survey. The two biases that we worry the most about are: a systematic variation (particularly of wealth, gender) between the ownership of mobile phones and the beneficiary population; and self-selection bias (a systematic difference between those who do and do not respond to the survey).
Regarding ownership and penetration of mobile phones, we’ve done considerable testing of in-person versus mobile surveys and in most cases have not found a systematic bias. However, we do still aim to correct for potential gender bias by over-sampling women to ensure that respondents match the gender profile of the customer base. We also run checks for bias by using data provided on beneficiaries (i.e. geographical location, gender, payment status) from the company and looking for any underrepresentation in customers interviewed. In general, in our work with the private sector we have found that customers buying goods and services from social enterprises have typically already bought a mobile phone. The exception can be countries/regions with unusually low mobile penetration rates.
Regarding self-selection bias, we worry about that the most when we see low response rates. We’ve done considerable work to increase response rates, and they are typically very high—at least in the 30-50% range, and as high as 65%+ for our work in Africa, for voice surveys. We see lower response rates for SMS surveys and for voice surveys in developed markets like the US, where our response rates are typically in the 10-20% range. Additionally, we draw considerable confidence from the data itself: we see a healthy spread of responses, and not only extremes.
How do you ensure a high response rate?
We believe that by setting the right tone, informing customers, and giving them control over the process, we can create a positive interaction that customers are happy to be a part of. Typically, we will notify customers in advance of the planned call, usually with a primer SMS to let them know that someone may be calling, and to tell them that we’ve been authorized by the company but are not from the company. We also use this message to explain the purpose of our call. Following this, we call each respondent up to three times, and allow them to pause the interview and continue it later if this is convenient for them.
How do you calculate your benchmark?
Our benchmark database now includes data from over 270,000 customer interviews across 70+ countries and multiple sectors. We created the benchmark to provide context for companies, to enable us and them to learn what works and what doesn’t, and ultimately to see what is possible. It enables a deeper understanding of impact and sets the foundation for using metrics as a benchmark for social performance.
The benchmark is essentially the average response to a given question by the customers we’ve spoken to. Because we ask standardized questions, and because we know information about the company (industry, business model, geography) we can cut these data to provide accurate comparisons for you: what did customers of companies that are similar to yours typically say in response to this question?
The result is data that allow you to see your relative performance across standardized key performance indicators of both customer feedback and social impact performance.
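Conceptually, a benchmark cut is just an average over responses that match a company’s profile. A toy sketch with made-up data and hypothetical fields (sector, country, score):

```python
from statistics import mean

# Made-up anonymized responses: (sector, country, score on a 0-10 question)
responses = [
    ("energy", "KE", 8), ("energy", "KE", 9), ("energy", "IN", 7),
    ("agri",   "IN", 6), ("agri",   "IN", 8), ("energy", "KE", 10),
]

def benchmark(responses, sector=None, country=None):
    """Average score across all responses matching the requested cut."""
    scores = [score for (sec, ctry, score) in responses
              if (sector is None or sec == sector)
              and (country is None or ctry == country)]
    return round(mean(scores), 2) if scores else None

print(benchmark(responses))                                 # 8.0 overall
print(benchmark(responses, sector="energy", country="KE"))  # 9.0 for this cut
```

The narrower the cut (same sector, same country, same business model), the more relevant the comparison, but also the fewer responses it rests on, which is why standardized questions across many projects are what make the cuts meaningful.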
What are the strengths and limitations of phone-based data collection?
We’ve developed our approach at 60 Decibels from the ground up with the strengths and limitations of remote survey methods in mind.
We recognize that there are potential issues of selection bias, including under-representation of poorer respondents and women. We address these through our sampling methods, taking steps to ensure that our final sample matches the population data we receive from the companies we work with. More broadly, the increasing availability of cellphones (91% of the world’s population are cell phone owners) lessens these concerns every year. We also take many steps to improve response rates, including sending SMS primers to respondents prior to calling them and attempting to call each respondent three times, on different days and at different times, to make sure they have an opportunity to answer the phone and participate in the survey.
In addition, we are aware that there are some broader, more generic concerns about remote surveys, including around whether accuracy of remote surveys might systematically be lower than in-person surveys. What we have found in practice, both in our R&D and in our work speaking to nearly 300,000 respondents around the world, is that accuracy for remote surveys is as strong or even stronger than for in-person surveys. We are able to achieve these high quality levels through thoughtful survey design and rigorous training.
For a bit more detail, we’ve tested for variations between in-person and remote surveys. We found the differences tend to be negligible, especially with regard to inclusivity analysis. Check out our case study (link here) on in-person versus remote data collection.
We already survey our employees/customers. How is a Lean Data study different?
Great job engaging your end stakeholders – we love to see it!
Some of our clients who use 60 Decibels phone surveys in addition to their internal surveys have found that:
- Phone surveys provide richer insights through the open-ended, qualitative questions than responses gathered in online surveys.
- Having a third party collect and anonymize responses has led to more honest, transparent feedback about challenges and areas for improvement.
- The 60 Decibels benchmarks, which are included in how we present the results, provide valuable context and comparability for survey results.