If you are a supply chain professional, you probably already know this: building analytics solutions that produce actual value is usually a complex endeavor that requires a lot of thought, time, and effort.
The main reason is that the data you need to consolidate to get value is typically very fragmented and lives in multiple data sources: internal, external (at suppliers and customers), and in between (e.g. forwarders and agents). Some data may not be accessible at all.
How do you build a complete supply chain analytics solution that lets you access, integrate, and analyze all this data at the same time? Based on my 12 years of experience in the analytics world, I would say you probably shouldn't try to do that…
Going for a “big bang” complete solution almost always fails, and not only in supply chain projects. In most industries it is simply too complex to get everything right at the same time. Even if you are detailed and careful, the project will run so long that conditions around you will change and create new problems: key people leave, organizations reorganize, priorities shift. The list of things that can happen along the way is long.
So that begs the question: How should you approach supply chain analytics to ensure you create actual value for your own organization and indirectly for your customers?
Below are my top 5 recommendations based on my own work in the field of Business Intelligence and Analytics:
Instead of trying to solve all your analytics needs with one big solution, approach them one at a time as mini-solutions, and integrate/connect them later if needed to create additional value.
Start by mapping out your needs as short functional requirement statements, each with at least one associated value driver. Having trouble coming up with what type of value a requirement drives? Then it just might be that this particular requirement does not in fact drive value and should be removed from the list. Talk to all parts of your business in a structured manner to capture needs that are as numerous and diverse as possible.

Once you have your needs mapped out, start to prioritize. Look at the value drivers: which of them are you able to quantify? Consider the feasibility of your requirements as well. If a need is high value but very complex and requires a ton of effort to realize in a solution, move it down the list. Start delivering value by focusing on needs with high value AND high feasibility. Get the first solutions done quickly, put them in the hands of the users, and create enthusiasm and trust before you get into the needs that are lower on the list.
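The prioritization step above can be sketched as a simple value-versus-feasibility scoring. The requirement names and scores below are hypothetical, purely for illustration of the idea:

```python
# Hypothetical requirements, each scored 1-5 on business value and feasibility.
requirements = [
    {"name": "On-time delivery dashboard",    "value": 5, "feasibility": 4},
    {"name": "Supplier lead-time analysis",   "value": 4, "feasibility": 5},
    {"name": "End-to-end cost-to-serve model", "value": 5, "feasibility": 1},
    {"name": "Warehouse pick-rate report",    "value": 2, "feasibility": 5},
]

# Rank by the product of value and feasibility: start where both are high,
# so high-value but low-feasibility items move down the list.
ranked = sorted(requirements,
                key=lambda r: r["value"] * r["feasibility"],
                reverse=True)

for r in ranked:
    print(f'{r["name"]}: score {r["value"] * r["feasibility"]}')
```

In practice the scoring would come from stakeholder workshops rather than a script, but even this simple ranking makes the trade-off explicit and easy to discuss.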
Once a solution has been deployed, it is quite common to see organizations act as if “the problem is solved”. Users now have access to the data, along with appropriate analytical capabilities to explore and make sense of it. Should be the end of the story, right? Wrong. What if the users are not using the solution, or do not understand how to use it? Then you have spent a lot of effort to create no value at all. Make sure you do not fall into that trap by continuously monitoring user adoption: if users are actively using the solution, it most likely creates value for them. Also, go back to your list of needs and value drivers. Is the solution delivering value the way you thought it would before you built it?
Is it quantifiable? Then consider calculating the ROI of your solution. Having a documented ROI can be a great boost and an accelerator for your future projects.
A few years ago, there was a strong push among companies to consolidate all analytics into one or a few tools or platforms built specifically for Business Intelligence and Analytics. Today, when many applications and systems come with built-in analytical capabilities, this obsession with putting everything in one place for its own sake seems to have cooled off substantially.

Timber Exchange is a good example of a platform that not only creates and holds data but also lets users analyze the data freely. Thanks to the platform’s marketplace characteristics, Timber Exchange also manages to consolidate data from both buyer and seller automatically. There would not be any additional value in extracting the data and analyzing it somewhere else. However, if you are interested in analyzing data from Timber Exchange combined with some other data, e.g. from your ERP, then it is a different story. Then it would make sense to use an Analytics/BI tool such as Qlik, Tableau, or Power BI to make that happen.
Focus on your needs, look at what you already have in place, do not overcomplicate, and use the right tool for the right job.
Data literacy is the ability to read, work with, analyze, and argue with data. In my experience, data literacy is often taken for granted when rolling out a BI/Analytics solution to a group of users, yet several studies have shown that far from everyone has adequate data literacy to derive the full value from a typical solution. It is therefore important to train users, not only in the technology you built the solution with, but also, and perhaps more importantly, in interpreting the charts and figures the right way. Without sufficient and appropriate training, a user can hardly be expected to generate insights on their own and get the full value from your solution. Again, do not overcomplicate: a solution built right should not require a doctorate from its users. Make it simple and fun, and explain through examples from their normal day on the job.
When you work with numbers, it is tempting to try to be 100% right 100% of the time. This becomes a challenge when data quality is poor, or when we must wait forever for “this extra little something” before we can calculate a figure with 100% guaranteed accuracy. A truly successful organization operates at high speed and should not let minor details slow it down. What I am saying is: do not strive for 100% accuracy at all costs; almost correct is sometimes good enough. You should never be sloppy with details, but in reality, most of the patterns in the data will still be there even if a decimal is off here or there for some reason. The same is true when you have, for instance, duplicate customers because of bad master data (for example, “Volvo” and “Volvo AB” appearing as two different customers in the data). Usually there are ways around this in whatever tool you are using. That is the fast route to value, rather than starting a big “Customer Data Cleanup Project” before you can launch your BI/Analytics solution. That would slow you down considerably while adding very little extra value weeks down the line.
When the use case does not demand 100% accuracy on all fronts to create value for users, consider making speed and agility a priority. You can let users know that a certain figure may be off by +/- 1% and the solution will still deliver 100% of its value. You can teach them to search for “Volvo*” to see all orders shipped to Volvo (until you have corrected the master data in the background). Don’t let tiny details slow down or stop entire projects - go live and get them out there, do your fixes and corrections in the background, and make users happy and productive – with speed!
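The “Volvo*” wildcard trick above is the kind of workaround most BI tools support out of the box. As a minimal illustration of the same idea in Python (the order rows are hypothetical), a wildcard pattern can group duplicate customers without any master data cleanup:

```python
import fnmatch

# Hypothetical order rows where bad master data splits one customer in two.
orders = [
    {"customer": "Volvo",    "amount": 100},
    {"customer": "Volvo AB", "amount": 250},
    {"customer": "Scania",   "amount": 80},
]

# The same wildcard pattern a user would type into the BI tool's search box.
pattern = "Volvo*"

# Sum amounts across all customer-name variants matching the pattern.
volvo_total = sum(o["amount"] for o in orders
                  if fnmatch.fnmatchcase(o["customer"], pattern))
print(volvo_total)  # prints 350
```

The point is not the code itself but the principle: a cheap workaround in the tool gets users to the answer today, while the master data fix can happen in the background.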
/ Andréas Pettersson-Koch, Director Business Value and Technical Sales
Andréas joined Timber Exchange recently after 12 years in the BI/Analytics space including more than a decade at Qlik. At Timber Exchange, Andréas works with our customers, partners and prospects to ensure they maximize the value from their use of the Timber Exchange platform.
Are you curious to see the analytical capabilities of Timber Exchange (and the rest of our functionalities of course)? Do not hesitate to reach out to us to schedule a live demo!