Survey Launch offers custom development and programming services.
Below are selected examples of our work, some of which have been featured in industry case studies and recognized with research awards.
A media research company conducting a small number of ad retention studies each month needed a solution to ramp up to 300 studies per month. A single study would typically test 25-100 independent elements. Each element had to be set up with multiple pieces of media and a number of attributes. Studies could also have custom questions.
Survey Launch engineered and built a custom application and interface for the client and consulted on the in-house workflow. Surveys move from section to section through a series of sign-offs, launching and closing automatically. On the back end, all data exchange is automated over secure web services.
Survey Launch administers the application and provides periodic updates and feature development.
At times the system exceeded 400 studies per month with a small staff of analysts.
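The sign-off pipeline described above can be pictured as a simple state machine. This is a minimal sketch under invented state names; the client's actual application and its stages are proprietary.

```python
# Hypothetical linear sign-off workflow: a study advances one stage per
# approval and launches/closes automatically at the appropriate states.
# All state names here are illustrative, not the client's actual stages.

STATES = ["draft", "media_review", "client_signoff", "live", "closed"]

class Study:
    def __init__(self, study_id):
        self.study_id = study_id
        self.state = "draft"

    def sign_off(self):
        """Advance to the next stage once the current one is approved."""
        i = STATES.index(self.state)
        if i < len(STATES) - 1:
            self.state = STATES[i + 1]
        return self.state

study = Study("ad-retention-001")
study.sign_off()  # draft -> media_review
study.sign_off()  # media_review -> client_signoff
```

A real system would gate each transition on a recorded approval and fire launch/close actions on entering the corresponding states.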
A researcher conducting a package design study needed to test respondent reactions to specific areas of a number of concept images. The researcher engaged a commercial third-party click capture app, but its methodology was unsound and the resulting data flawed.
The researcher contacted Survey Launch and described the research goals and the urgency of the re-fielding schedule. Survey Launch designed and implemented a methodologically sound version of the click capture app within 24 hours, and suggested a way to simultaneously capture additional information of potential interest to the analysis.
During the short re-fielding period, Survey Launch designed and implemented an interactive deep-dive tool that exceeded the capabilities of the original commercial third-party tool, allowing the end client to dial in various views of the data, including scatter plots and a heat map.
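One key to sound click capture is storing coordinates normalized to the displayed image rather than raw screen pixels, so data from different resolutions is comparable before binning into a heat map. The sketch below illustrates that idea only; function names and the grid size are assumptions, not the actual tool.

```python
# Illustrative click-capture aggregation: normalize clicks to the image,
# then bin them into a grid of counts suitable for a heat map.

def normalize(click_x, click_y, img_w, img_h):
    """Convert pixel coordinates to [0, 1) fractions of the image."""
    return click_x / img_w, click_y / img_h

def heatmap(clicks, bins=10):
    """Bin normalized clicks into a bins x bins grid of counts."""
    grid = [[0] * bins for _ in range(bins)]
    for x, y in clicks:
        col = min(int(x * bins), bins - 1)
        row = min(int(y * bins), bins - 1)
        grid[row][col] += 1
    return grid

# Two nearby clicks on a 600x400 concept image fall into the same cell.
clicks = [normalize(150, 90, 600, 400), normalize(155, 95, 600, 400)]
grid = heatmap(clicks)
```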
A media research company needed to test a number of media elements against a large sample group on a weekly basis for a syndicated product. New media elements would be added each week, then rotated back out after a certain time period had passed, based on attributes of the element.
Survey Launch worked with the research team to design and develop the survey instrument and supporting administrative applications including an automated back-end data exchange.
Survey Launch manages and administers the application, which marked its 10-year anniversary in 2016.
A vacation property rental company with a strong online presence needed to test a series of features and pricing models using a large Sawtooth MBC model. The researcher wanted the exercise to closely imitate the company's ecommerce site to increase realism.
Survey Launch was able to closely match the exercise to the company's ecommerce site, including many dynamic elements and subrules employed on the real site.
The researcher found the approach to be effective and the resultant simulators proved very accurate over time.
A researcher needed to conduct a concept test with a large number of attributes. The list of attributes would be initially reduced using a battery of questions (i.e., a "build your own" exercise), and the resultant winners tested in combination using a conjoint exercise.
In order to achieve a realistic presentation, each combination of features would be represented by a unique set of images with several views. The total number of images that had to be created was 360,226. Manual image processing was infeasible given the sheer number of images to be composed.
Survey Launch created a dynamic image generation module that was able to compose images based on the active features and layout specs for a given combination.
This solution has been used on several projects that required dynamically composed images to achieve realistic presentation of features and concepts.
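The composition approach can be sketched as pasting feature layers onto a base canvas at positions given by a layout spec. A real implementation would use an imaging library such as Pillow; plain nested lists stand in for images below, and every layout value is invented for illustration.

```python
# Hypothetical dynamic image composition: each active feature maps to a
# layer plus x/y offsets, and layers are pasted in order onto a canvas.

def blank(w, h, fill=0):
    return [[fill] * w for _ in range(h)]

def paste(canvas, layer, x, y):
    """Overlay `layer` onto `canvas` with its top-left corner at (x, y)."""
    for r, row in enumerate(layer):
        for c, px in enumerate(row):
            canvas[y + r][x + c] = px
    return canvas

# Layout spec (illustrative): feature name -> (layer image, x, y)
LAYOUT = {
    "handle": (blank(2, 2, fill=7), 1, 1),
    "logo": (blank(1, 1, fill=9), 0, 0),
}

def compose(active_features, w=4, h=4):
    """Build the image for one feature combination from its layout."""
    canvas = blank(w, h)
    for name in active_features:
        layer, x, y = LAYOUT[name]
        paste(canvas, layer, x, y)
    return canvas

img = compose(["handle", "logo"])
```

Generating each of the 360,226 images on demand from such a spec avoids composing and storing them all by hand.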
An insurance company conducts product testing on various sensing devices and protection systems. Respondents are recruited into multi-part studies with 4-6 segments conducted over a 4-12 month period.
Multi-part studies typically suffer from significant attrition, often greater than 50%. Survey Launch proposed the use of a "respondent portal" to increase respondent engagement and reduce attrition.
Each screened respondent received a link to a personal website at the start of the project. Their personalized page contained instructions, survey links, messages from the company, and a schedule of the project.
The respondent portal pulled information from survey records and from a respondent sample database administered by the researcher through an administrative portal. Survey invites and periodic reminders all contained the same personal website link.
This technique has been used with great efficacy on a number of multi-part product tests and diary studies to increase participation and reduce attrition.
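One building block of such a portal is a stable, unguessable personal link that every invite and reminder can reuse. The sketch below shows one plausible scheme using an HMAC over the respondent ID; the token format, base URL, and field names are assumptions, not the production design.

```python
# Illustrative personal-link generation: the same respondent always gets
# the same link, and links are not guessable without the project secret.

import hashlib
import hmac

SECRET = b"project-secret-key"  # illustrative; store securely in practice

def portal_link(respondent_id, base="https://example.com/portal"):
    """Derive a stable personal portal URL for one respondent."""
    token = hmac.new(SECRET, respondent_id.encode(),
                     hashlib.sha256).hexdigest()[:16]
    return f"{base}/{token}"

link = portal_link("R-1001")
```

Because the link is deterministic, invites, reminders, and the portal itself can all be driven from the respondent ID alone.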
A research company conducting a food product test needed to quickly screen and schedule a large number of respondents into grouped sessions. Given the number of products that were to be tested, each group would test a subset of the products. Groups had to be carefully constructed to maintain demographic and preference profiles across all products. There were numerous screening criteria and a limited number of seats in each test session. Sessions were to be conducted at a national level using different central location test companies.
Survey Launch proposed and built a screening/scheduling application driven by the researcher's impressive spreadsheet of groupings and taste combinations. Facilities were able to screen and schedule respondents, reschedule as needed, and activate and monitor progress of respondents in the self-administered taste test, all through the custom application.
The application was created in 48 hours.
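The scheduling core of such an application reduces to placing each screened respondent into a session that matches their assigned product group and still has seats. The group definitions and capacities below are invented; the real application was driven by the researcher's grouping spreadsheet.

```python
# Hypothetical session scheduler: first matching session with open seats
# wins; a None result means the facility must offer a reschedule.

sessions = [
    {"id": "NYC-AM", "group": "A", "seats": 2, "booked": []},
    {"id": "NYC-PM", "group": "B", "seats": 2, "booked": []},
]

def schedule(respondent_id, group):
    """Book the respondent into the first open session for their group."""
    for s in sessions:
        if s["group"] == group and len(s["booked"]) < s["seats"]:
            s["booked"].append(respondent_id)
            return s["id"]
    return None  # no seats left for this group

schedule("R1", "A")  # NYC-AM
schedule("R2", "A")  # NYC-AM (fills the session)
schedule("R3", "A")  # None (group A session full)
```

A production version would also enforce the demographic and preference quotas within each group before confirming a seat.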
A telecom needed to test a complex series of bundling options using a Sawtooth MBC. The exercise was to mimic the ordering process on the telecom's website, both in function and appearance. Each scenario was to be shown across several screens, similar to how the respondent would order services on the telecom site.
Scenarios were driven by the researcher's MBC design and adaptive to respondent input. Further logic was imposed by a complex series of business rules from the telecom.
A research company conducting a survey needed a way to capture open-end responses including screen shots, images, and free form text.
Survey Launch incorporated a full-fledged document editor into the survey that allowed drag-and-drop and copy/paste of images and screen captures, as well as free form text entry and word processing capabilities.
A custom report tool was created to present the responses exactly as the respondent had entered them.
This is a historical case study from 2003.
A travel magazine's annual hotel and resort study, conducted with paper-and-pencil surveys, contacts approximately 250,000 readers to rate destinations on various attributes. Survey Launch was contracted by a research company to help convert the project to an online format and increase the number of rated attributes to over 40,000.
Survey Launch proposed a survey design that organized the property attributes in a hierarchy, allowing the respondent to quickly step through the survey and rate only the relevant attributes.
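The hierarchical design means a respondent only descends into the branches relevant to them, so most of the 40,000+ attributes are never shown. The sketch below illustrates the pruning idea with an invented two-level hierarchy.

```python
# Illustrative attribute hierarchy: category -> subcategory -> attributes.
# A respondent who selects only "Hotels" never sees the "Resorts" branch.

HIERARCHY = {
    "Hotels": {
        "Service": ["Front desk", "Housekeeping"],
        "Rooms": ["Comfort", "Cleanliness"],
    },
    "Resorts": {
        "Activities": ["Golf", "Spa"],
    },
}

def attributes_to_rate(selected_categories):
    """Collect only the leaf attributes under the selected branches."""
    out = []
    for cat, subcats in HIERARCHY.items():
        if cat not in selected_categories:
            continue  # whole branch skipped
        for leaves in subcats.values():
            out.extend(leaves)
    return out

to_rate = attributes_to_rate(["Hotels"])  # 4 of the 6 attributes
```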
A custom program was created to generate the survey directly from the magazine's working database, allowing the magazine to continue to edit the property list until the day of launch.
Survey Launch also created and deployed a series of algorithms to sniff out repeat voters and ballot-stuffing efforts. Today these techniques are in common use and known as device fingerprinting.
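The core of such detection is hashing a bundle of request attributes into a stable key and flagging ballots whose key has been seen before. The attribute list below is illustrative; real fingerprints combine many more signals and tolerate partial matches.

```python
# Sketch of repeat-voter detection via a simple device fingerprint:
# identical attribute bundles produce the same hash, so a second ballot
# from the same fingerprint is flagged. Attributes shown are examples.

import hashlib

seen = set()

def fingerprint(ip, user_agent, screen, tz):
    """Hash a bundle of request attributes into a stable key."""
    raw = "|".join([ip, user_agent, screen, tz])
    return hashlib.sha256(raw.encode()).hexdigest()

def is_repeat(ip, user_agent, screen, tz):
    """Return True if this fingerprint has already voted."""
    fp = fingerprint(ip, user_agent, screen, tz)
    if fp in seen:
        return True
    seen.add(fp)
    return False

first = is_repeat("10.0.0.1", "Mozilla/5.0", "1024x768", "UTC-5")   # False
second = is_repeat("10.0.0.1", "Mozilla/5.0", "1024x768", "UTC-5")  # True
```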