Interview: The Beginnings of the SuperStrong Assessment

Interviewed by Karen Gonzalez, Product Marketing Manager at CPP, in June 2017.

Challenging the status quo is what we’re all about at CPP Innovation Labs. Craig Johnson, Director of Data Science, talks to us about how the SuperStrong assessment was developed.

Karen: How did you begin the journey of developing the SuperStrong assessment? 

Craig: In 2015, Chris Mackey, SVP of Innovation Labs, and I were looking for ways to show how personality assessments could drive measurable insights and action. Our strategy was to find organizations with questions we could address with our assessments. Personality and interests will not solve everything, so we looked for organizations that were willing to play in the sandbox and create new and interesting solutions with us.

K: Who did you speak with? Did you have certain criteria?

C: We wanted curious early adopters. We spoke with early-stage startups, venture capital firms, large insurance companies, state departments of transportation, job boards, school districts, and career and technical schools.

K: What did you find after doing your market research?

C: Several things:

  1. There is a desire to identify people’s interests, connect them to job openings and to educational and career pathways, and determine the job environment a person would best fit into.
  2. Users and organizations both wanted a shorter assessment.
  3. Individuals and organizations both want actionable insights. Individuals want to see their results and know what paths to take or behaviors to change.  Organizations want to better understand their people, where to put them, what they are interested in, how to develop them, and ultimately how it affects the bottom line.
  4. Assessments tend to become a one-time event, where people take an assessment, get their results, and walk away. Users rarely revisit their results over time, and organizations don’t use the data. These one-time “events” tend to produce static text that requires interpretation by trained professionals. The organizations we talked with needed solutions that were at least initially self-interpretable, so counselors and trainers weren’t a bottleneck.

In the end, we decided to start with the Strong Interest Inventory® assessment to address interests. We set about finding a way to shorten the assessment, make the assessment results more actionable for individuals and organizations, and find ways to make the assessment more accessible.

K: Shortly after, you began doing some testing with an unreleased 60-item Strong you’d rediscovered that focused on Holland codes. What did you do and what were the results?

C: We developed an API and partnered with a very large job board for college graduates. Users were given the opportunity to complete that assessment. We then tied their interests to real-world jobs: the experience was interactive, the results changed based on available openings, and the insights were actionable!
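The interview doesn’t describe the actual matching logic, but the idea of tying Holland-code interest results to available jobs can be illustrated with a small hypothetical sketch: represent each user and each job as a six-dimensional RIASEC score profile and rank jobs by similarity. The profiles, job titles, and use of cosine similarity below are all assumptions for illustration, not the SuperStrong algorithm.

```python
import math

# Hypothetical sketch: rank job openings by how closely their RIASEC
# (Holland code) profile matches a user's interest profile.
# R, I, A, S, E, C = Realistic, Investigative, Artistic, Social,
# Enterprising, Conventional.

def cosine(a, b):
    """Cosine similarity between two interest-score vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def rank_jobs(user_profile, jobs):
    """Return job titles sorted from best to worst interest match."""
    scored = [(cosine(user_profile, profile), title) for title, profile in jobs]
    return [title for _, title in sorted(scored, reverse=True)]

# Illustrative data only: scores on each of the six RIASEC themes.
user = (2, 9, 3, 4, 5, 7)  # strongly Investigative and Conventional
jobs = [
    ("Data Analyst",     (2, 9, 2, 3, 4, 8)),
    ("Graphic Designer", (2, 3, 9, 4, 5, 2)),
    ("Sales Manager",    (2, 3, 3, 6, 9, 4)),
]
print(rank_jobs(user, jobs))  # Data Analyst ranks first for this user
```

Because the ranking is recomputed against whatever openings are currently listed, results change as the job board changes, which is what made the experience interactive rather than a static report.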

K: But you didn’t stop there…

C: Nope! Although the first iteration was a success, there were still things we wanted to change and improve. We set about conducting a series of experiments with approximately 400,000 archival users of the Strong Interest Inventory assessment to develop new algorithms. We validated the assessment with the archival data, conducted experiments to confirm that users felt their results were accurate, and replicated the results. The quantitative data showed that the results were accurate, stable, and valid. User feedback showed they felt the results were accurate and helpful.

K: Since the SuperStrong was initially rolled out, there have been a number of improvements and features added. Can you speak to some of those?

C: We continued to ask ourselves several questions, each of which led to a new version of the assessment:

  1. We thought, “Wouldn’t it be cool if we could tie individuals’ results directly to information about jobs from O*NET™?” O*NET is full of information that our users would find helpful. We set about connecting individual results to O*NET for a more complete experience. Along the way, we started to pull in other data sources.
  2. We also thought, “Wouldn’t it be helpful to see job outlooks?” We decided to pull in data from the Bureau of Labor Statistics.
  3. Lastly, we explored, “What if we could provide information on how their interests map to actual colleges?” We pulled in the Department of Education’s Integrated Postsecondary Education Data System (IPEDS).

K: A lot has been accomplished in such a short timeframe! What have you been hearing from users?

C: We were thrilled and motivated by the response we were getting from high schools, community colleges, technical schools, and career colleges. Again, we didn’t know quite where we would end up, so we release and iterate on improvements quickly. As an example, we received feedback on one of the original assessment versions, so we went through a revision using items focused more on what people “do” rather than just job titles. We also received requests for a Spanish translation, which we’ve built and released.

K: So what’s next for the SuperStrong assessment?

C: We’ve continued to add capabilities like aggregated reports for classrooms, the ability to save and compare jobs, custom client questions, better integration with governmental data, and other things we aren’t quite ready to share yet. All of these improvements have come directly from users asking us to help solve a problem.

Craig’s final thoughts…

Since its inception, our team has grown and our technology has changed, but our core tenets remain the same:

  1. Look for opportunities to use personality and interests to answer questions and drive results.
  2. The assessment process shouldn’t be an event. We seek to find ways to make the assessment results applicable over time and to reuse the data for individuals and organizations.
  3. Find ways to make the experience interactive so people can explore rather than be dictated to.
  4. Make the results something people can actually use. Those actions could include applying for a job, identifying a school, finding a hobby, or identifying a career path within an organization.
  5. Constantly look for interactions that allow us to play in the sandbox to find new and innovative solutions.

We are constantly striving to get better. We couldn’t have guessed where we would be today, nor where we will be tomorrow, and we’ve built our VitaNavis platform (which is powered by the SuperStrong assessment) around that reality. We are very excited that the technology stack we are putting together will be flexible enough to help answer your custom questions. We have yet to scratch the surface of where we want to go!


