FAR Con Follow-Up

Axiom’s Rob Beachy, a nationally recognized speaker, leader, and writer in the fields of innovation and management, recently attended the FAR Con Financial and Retail Conference on Analytics in Minneapolis. Here is his take on big data:

We really enjoyed the FAR Con conference sponsored by MinneAnalytics. There was a diversity of approaches, problems, and solutions for nearly every challenge in today’s world of big data.

Not surprisingly, there are more new vendors with tools for managing and deciphering big data, and many new open-source programs as well.

Familiar programs, state-of-the-art improvements

SPSS, SAS, STATA (a favorite of my students), and R were the most widely used tools among the participants. Yet many who own and use SPSS and SAS were unaware of the time series modeling capability of these traditional tools, which have done an excellent job of growing with the need for more in-depth business intelligence for forecasting, trend analysis, and testing marketing programs.
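
The discussion focused on the built-in time series procedures in those packages, but the same kind of model can be sketched with open-source tools as well. The snippet below is a minimal illustration in Python (pandas plus statsmodels), not the SPSS or SAS workflow itself; the file name, column names, and ARIMA order are assumptions for the example.

```python
# Minimal time series forecasting sketch with open-source tools.
# The CSV file, column names, and ARIMA order are hypothetical.
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Hypothetical input: a monthly sales series indexed by date.
sales = pd.read_csv("monthly_sales.csv", index_col="month",
                    parse_dates=True)["sales"]

# Fit a simple ARIMA model and forecast the next 12 months.
model = ARIMA(sales, order=(1, 1, 1))
fitted = model.fit()
print(fitted.forecast(steps=12))
```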

We were gratified to receive emails from participants who came away understanding how we use government resources (Census, BLS, and other sources) to solve complex big data problems in analysis and forecasting.

It’s all in the math

The conference was also an opportune time to openly discuss challenges, approaches, and solutions from a variety of perspectives.

Whether a company sells services or products, serves consumers or commercial clients, is B2B or retail, the same equations apply.

A word to the wise

Choosing the “right data” is still the key to all analytics. You can code a time series model yourself or use a canned program and still get incorrect results.

We always recommend a “sanity” check against previous data or government data to make sure the results reflect reality, as in the sketch below.
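
One way such a check might look in code: compare the model’s output against last year’s actuals (or a comparable government series) and flag any month that swings implausibly. The file names, column names, and 30% threshold here are assumptions for illustration, not a prescription.

```python
# Minimal "sanity check" sketch: compare a forecast to prior-year actuals
# (or a government benchmark series) and flag large deviations.
# File names, column names, and the 30% threshold are hypothetical.
import pandas as pd

forecast = pd.read_csv("forecast.csv", index_col="month")["value"]
benchmark = pd.read_csv("last_year_actuals.csv", index_col="month")["value"]

pct_diff = (forecast - benchmark) / benchmark
suspicious = pct_diff[pct_diff.abs() > 0.30]  # more than a 30% swing

if suspicious.empty:
    print("Every month is within 30% of the benchmark.")
else:
    print("Months that deserve a second look:")
    print(suspicious.map("{:+.0%}".format))
```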

When all else fails, go observe the data in real time

At FARCON, we gave examples of how data analysis with the most sophisticated tools, even with replicated and verified data, could not provide the answer to a serious health issue in a factory. Simple observation over a few hours provided a solution the data could not. Others shared retail scenarios where observing the purchase process uncovered a disconnect: in one case a slight change in package color, and in another the area where the products were placed, changed purchase behavior.

The three keys to using BIG DATA are:

  1. Utilizing the key variables, or when all else fails, testing the variables in every feasible combination to ascertain a correlation (see the sketch after this list).
  2. Benchmarking the data against known norms: weather, micro- and macroeconomic trends, and consumer and commercial spending.
  3. Having people who understand statistical analysis, its implications, and its limitations.
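
Here is a minimal sketch of the first key: screening candidate variables (and simple pairwise interactions) for correlation with an outcome. The data source, column names, and the choice of Pearson correlation are assumptions for the example; the output is a set of leads to verify against benchmarks, not answers.

```python
# Screen candidate variables, and simple pairwise interactions, for
# correlation with an outcome. Data source and column names are hypothetical.
from itertools import combinations

import pandas as pd
from scipy.stats import pearsonr

df = pd.read_csv("monthly_metrics.csv")  # assumed numeric monthly dataset
outcome = "sales"
candidates = [c for c in df.columns if c != outcome]

results = []
for var in candidates:
    r, p = pearsonr(df[var], df[outcome])
    results.append((var, r, p))
for a, b in combinations(candidates, 2):
    r, p = pearsonr(df[a] * df[b], df[outcome])  # crude interaction term
    results.append((f"{a}*{b}", r, p))

# Strongest correlations first; treat these as hypotheses, not conclusions.
for name, r, p in sorted(results, key=lambda t: abs(t[1]), reverse=True)[:10]:
    print(f"{name:30s} r={r:+.2f} p={p:.3f}")
```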

Remember, you have to speak their language!

There are reams of “free” government data, and people in government who are “paid” by your tax dollars to help you solve BIG DATA issues, if you speak their language.

You can always pay more, but the key is finding those people who walk the walk, talk the talk and understand the implications and limitations of statistical analysis.

Now find the people in your organization who speak the language of advanced statistics and business intelligence, or find a partner to help you use BIG DATA to increase profit, solve problems, or forecast the future.

As always, if you’re interested in learning more about big data, contact Rob at rbeachy@axiomcom.com.
