Behind every Monetate product is a team of smart people tasked with turning their own, or someone else’s, ideas into reality. Such is the case with last week’s launch of LivePredict, Monetate’s latest product, which combines automated segment discovery with in-the-moment personalization, giving marketers the ability to instantly take action and seamlessly deliver the experiences that customers want.
The team responsible for the development of LivePredict includes four individuals who scoured hundreds of terabytes of data to create a solution that gives enterprise companies a way to unlock the power of big data.
The LivePredict development team is led by Rob McGinley. Like many of Monetate’s engineers, Rob is a graduate of a Philadelphia-area university, having earned a degree in computer science from La Salle University. His first job after graduating was with Quintiq, a supply chain planning and optimization solutions provider, where he worked on advanced planning and optimization software and cut his teeth processing data and searching for optimal business solutions.
For LivePredict, Rob was surrounded by developers with a diverse skill set: Jeff Patti, a former Lockheed Martin research scientist who also works on the Ecommerce Quarterly research; statistician and Python whiz Pat O’Brien; and Kris Molendyke, who has prior experience at Siemens and the Department of Defense.
I recently sat down with Rob and Tom Janofsky, Monetate’s vice president of engineering, to get a behind-the-scenes look at what went into the creation of LivePredict.
Q: Who is the ideal user of LivePredict?
TJ: I think it’s both the marketer and the analyst. Until now, the analyst had to pore through data, run multiple variations of segmented reports on whatever infrastructure was available, accumulate all of the results, rank and score them, then take that information to someone who could do something about it and see if they could act on it. LivePredict can do all of that for you every day, keep it up to date, and automate that work for you.
It will help analysts determine a course of action faster and think more about their customer. For instance, it could move you from spending hours upon hours figuring out that your website is underperforming for visitors from the Midwest, to talking to people about how to improve your offering in the Midwest. You can literally go from taking eight hours to get to that first question to spending two minutes to get to it, and then have eight hours to discuss what to do about it.
Q: What was one of the biggest challenges your team faced in creating LivePredict?
RM: To me, the most interesting challenge was that whatever data volume one enterprise business faces, we had to deal with an order of magnitude more. We had to try a number of approaches before settling on a data processing framework that could handle both the amount of data we need to process and the types of modeling we need to support.
We ended up being able to process such large data sets using MapReduce on Hadoop, the distributed data processing framework used by our engineers. We also leveraged a new version of MapReduce and our knowledge of the structure of our data to vastly improve runtime. The end result is a product that can more than handle the load and is immediately scalable.
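The interview doesn’t share Monetate’s actual jobs, but the MapReduce pattern Rob describes can be sketched in a few lines. The following is a minimal illustration in the style of Hadoop Streaming, where a mapper emits key/value pairs and a reducer aggregates each key’s values; the `segment` and `revenue` fields and the local shuffle simulation are hypothetical, purely for illustration.

```python
# Minimal sketch of the MapReduce pattern described above, in the style of
# Hadoop Streaming. Field names ("segment", "revenue") are hypothetical --
# this is not Monetate's actual schema or job code.

from itertools import groupby
from operator import itemgetter

def mapper(record):
    """Emit a (segment, revenue) pair from one raw session record."""
    yield record["segment"], record["revenue"]

def reducer(segment, revenues):
    """Aggregate total revenue for one segment."""
    return segment, sum(revenues)

def run_local(records):
    """Simulate the shuffle/sort phase locally, then reduce per key."""
    pairs = sorted(kv for r in records for kv in mapper(r))
    return dict(
        reducer(key, [v for _, v in group])
        for key, group in groupby(pairs, key=itemgetter(0))
    )

# Example: sessions tagged by visitor region
sessions = [
    {"segment": "midwest", "revenue": 40.0},
    {"segment": "northeast", "revenue": 75.0},
    {"segment": "midwest", "revenue": 25.0},
]
print(run_local(sessions))  # {'midwest': 65.0, 'northeast': 75.0}
```

On a real cluster, the mapper and reducer would run across many machines and Hadoop would handle the shuffle; the same per-key aggregation shape is what lets a job like this scale to the data volumes Rob mentions.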
Q: What was the biggest aha moment during product development?
RM: When we really started to look at the data, we realized how important it was to come up with some kind of business or economic impact for our customers. It wasn’t just about raw values. What we are searching for are opportunities.
TJ: I agree with Rob. While it’s interesting to look for patterns, it is far more valuable to look for opportunities. A lot of our algorithms now are about detecting opportunities, not just patterns.
Q: A lot of people are talking not only about what LivePredict can tell them about their customers, but how that information is presented within the user interface. How important was that to the team?
RM: We focused a lot on user experience, because processing the data into a usable form only gets you so far. The results then must be presented in an accessible way that highlights what’s important while making everything actionable within the same interface.
TJ: On the visualization side, Kris [Molendyke] realized that we had stretched our existing tools as far as we could, so he introduced D3.js, a widely used open source library written primarily by Mike Bostock, currently at NYTimes.com, and used on sites such as Nate Silver’s FiveThirtyEight blog. The more traditional graphing and charting tools just weren’t sufficient for this project. We were looking to be responsive, rich, and multidimensional, and I think we certainly got there.