Multivariate analytics

Is Good Enough, Good Enough?

Using Existing Data and People to Deliver Higher Quality, Yield and Throughput

January 14, 2020
In the recent blog post “Marrying Digital Maturity Efforts with Workforce Maturity Challenges,” I mentioned the challenges attendees voiced at the 2019 International Maintenance Conference in early December. Organizations struggle to make the best use of the expertise and the immense amount of data at hand to deliver not only the best output but also sustainable knowledge transfer: in other words, to figure out the optimal proverbial recipe that makes the best use of the ingredients and chefs available.

Taking this analogy a step further, I’ll first recognize that not everyone likes to bake – but I do. Some fellow foodies may have stumbled upon the graphic that shows why your chocolate chip cookies might not be turning out the way you want: baking soda vs. baking powder, heat too high, not enough brown sugar, wrong type of flour, and so on. So now you know one (of likely many) key factors, but what if you’re pressed for time and desperately want to make some cookies? Let’s face it: most of us would just run to the store, but some might keep tweaking the recipe. What should you try first in your experimentation? This sounds like a whole lot of uneaten cookies and a whole lot of time to me. In the business world, this translates to waste and cost.

Now, let’s parallel this to finding the key parameters impacting the final output of a manufactured product. There are different methods to find what is most critical to the process and the final product. Traditional data science approaches are sometimes too time-consuming, so organizations veer toward multivariable approaches. Multivariate analysis provides the most detailed answers more quickly; however, it requires a data science skill set that is often in short supply. Additionally, like the cookie baking cheat sheet, the results stop at identifying what might be wrong with your recipe, leaving process engineers wondering, “What do we try first?”
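
As a rough illustration of the idea only (not ProMV’s own modeling), the sketch below fits a generic PLS model in Python with scikit-learn to a made-up batch history and ranks four hypothetical parameters by how strongly they influence a quality measure. All column names and data here are assumptions for the example.

import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)

# Hypothetical batch history: rows are batches, columns are process parameters.
X = rng.normal(size=(200, 4))            # temperature, mix_time, feed_rate, pH
quality = 2.0 * X[:, 0] - 1.0 * X[:, 2] + rng.normal(scale=0.3, size=200)

# Fit a two-component PLS model relating the parameters to the quality measure.
pls = PLSRegression(n_components=2)
pls.fit(X, quality)

# Rank parameters by the magnitude of their PLS regression coefficients,
# a rough stand-in for the "which knob matters most?" question.
names = ["temperature", "mix_time", "feed_rate", "pH"]
importance = np.abs(pls.coef_).ravel()
for name, score in sorted(zip(names, importance), key=lambda p: -p[1]):
    print(f"{name:12s} {score:.2f}")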

Only Aspen ProMV Optimizer lets users test scenarios offline to ensure higher success rates in the first ‘live’ test that varies key parameters. By indicating the desired output and setting thresholds for other known constraints, ProMV Optimizer provides a virtual roadmap to achieving the desired process output.
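
To make the “what do we try first?” step concrete, here is a minimal, generic what-if sketch using SciPy: it searches, within assumed operating limits, for parameter settings whose predicted quality lands on a chosen target. The stand-in linear predictor, bounds and target value are all hypothetical, and this is not how ProMV Optimizer works internally.

import numpy as np
from scipy.optimize import minimize

# Stand-in predictor: in practice this would be a multivariate model fitted
# from plant history (for example, the PLS sketch above).
coef = np.array([2.0, 0.1, -1.0, 0.05])
def predict_quality(x):
    return float(coef @ x)

target_quality = 1.5                     # hypothetical desired output
bounds = [(-1.0, 1.0)] * 4               # hypothetical operating limits per parameter

def miss(x):
    # Squared gap between the model's predicted quality and the target.
    return (predict_quality(x) - target_quality) ** 2

# Search for settings that hit the target while staying inside the limits.
result = minimize(miss, x0=np.zeros(4), bounds=bounds)
print("suggested settings:", np.round(result.x, 2))
print("predicted quality: ", round(predict_quality(result.x), 2))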

ProMV was designed for use by those most familiar with the process. Data scientists play a key role in any organization; however, ProMV can further optimize their work by ensuring that analysis incorporates process knowledge.

To hear more about how you can optimize quality, batch lengths and other key business drivers, join the webinar “Leverage History to Reduce Batch Cycle Time by 30% with Aspen ProMV.”
 
