Data Systems

For problems of extreme complexity that demand a high degree of customization, we develop and manage custom enterprise-wide systems. Data Systems are our Special Ops: long-term projects of several months to several years that require undivided attention from teams on both sides. The benefits, of course, are proportional to the effort. Because they address the most valuable challenges of an enterprise and solve them systematically, our enterprise-wide systems become profitable in the first year, bring multiples in ROI, and generate value far beyond the original scope of the project.

Imagine a food and flavor company that wants to shorten its product development cycle and reduce the time needed to match flavor and fragrance compositions to what the consumer wants. Acquiring efficient flavor composition equipment is not enough. Consumer preference data must be analyzed as fast as it is produced, but sensory evaluation science is a hard nut to crack: people like different things, can be driven by the same ingredient in different directions, rate their liking on different internal scales, and cannot evaluate more than a few dozen flavors at a time.

Discovering what drives consumer liking, and how to design products that many people like a lot, requires custom analysis workflows that hide the analytical complexity under the hood and focus on interactive visualizations of data and insights, so that chemometricians and flavor designers can be most effective. Even that, however, is not enough to reap the maximal product design advantage. The models that connect product composition to liking can be optimized to suggest the flavor compositions that would elicit maximal liking in the target population. This requires another set of workflows that automatically suggest compositions of interest matching the designer's goals: flavors that many people like a lot, flavors that a segment likes extremely, flavors with the highest average liking and lowest deviation, or with the highest liking and highest profitability. We can go further still and integrate the analysis and optimization workflows with the data collection process, so that promising flavor combinations are smoothly composed and evaluated to bring next-level insights, improve the models, and increase focus and understanding.
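To make the "highest average liking, lowest deviation" goal concrete, here is a minimal toy sketch, not our actual workflow: it assumes a hypothetical liking model (each panelist has an ideal composition, and liking falls off with distance from it) and searches a grid of candidate compositions for the one that balances mean panel liking against panel disagreement.

```python
# Toy sketch of liking-model optimization. The "panel_ideals" model and all
# numbers are illustrative assumptions, not real consumer data.
from itertools import product
from statistics import mean, stdev

# Hypothetical panel: each panelist's ideal (sweetness, citrus) point.
panel_ideals = [(0.3, 0.7), (0.5, 0.5), (0.6, 0.4)]

def liking(panelist_ideal, composition):
    # Liking on a 9-point hedonic scale, dropping with squared distance
    # from the panelist's ideal composition.
    ds = composition[0] - panelist_ideal[0]
    dc = composition[1] - panelist_ideal[1]
    return max(0.0, 9.0 - 40.0 * (ds * ds + dc * dc))

def score(composition, spread_penalty=1.0):
    # High mean liking is good; high disagreement across the panel is not.
    ratings = [liking(p, composition) for p in panel_ideals]
    return mean(ratings) - spread_penalty * stdev(ratings)

# Exhaustive grid search over candidate compositions in [0, 1] x [0, 1].
grid = [i / 20 for i in range(21)]
best = max(product(grid, grid), key=score)
print(best, round(score(best), 2))
```

Swapping the `score` function is all it takes to target a different design goal, e.g. maximal liking within one consumer segment.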

Now imagine a chemical plant that has operated smoothly for 30-50 years, with lots of sensors of varying complexity. Some are easy, like temperatures and flows. Some are quite difficult and require a lot of maintenance and expertise, e.g., assessing the chemical composition of a stream every several hours. In an ideal world, all sensors would work flawlessly and deliver noise-free data every few seconds. In reality, sensors fail, sometimes often: in some cases no data is produced, in others wrong data is produced. Keeping the system operational therefore requires continuous attention and effort. A monitoring system that constantly tracks and assesses the consistency of the data saves time and allows a targeted maintenance effort; maintenance becomes a proactive activity rather than a reactive one.
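As an illustration of what such consistency tracking can look like in its simplest form, the sketch below (our assumptions, not a production system) flags two common failure modes: a sensor frozen at one value, and a reading far outside the recent operating range.

```python
# Minimal consistency check for a single sensor stream. Window size and
# z-score limit are illustrative assumptions.
from collections import deque
from statistics import mean, stdev

class SensorMonitor:
    def __init__(self, window=20, z_limit=4.0):
        self.history = deque(maxlen=window)
        self.z_limit = z_limit

    def check(self, value):
        """Return None if the reading looks consistent, else a reason."""
        if len(self.history) >= 5:
            if len(set(self.history)) == 1 and value == self.history[-1]:
                return "stuck"            # sensor frozen at one value
            mu, sigma = mean(self.history), stdev(self.history)
            if sigma > 0 and abs(value - mu) / sigma > self.z_limit:
                return "out-of-range"     # likely spike, drift, or fault
        self.history.append(value)        # suspect readings are kept out
        return None

monitor = SensorMonitor()
for reading in [70.1, 70.3, 69.9, 70.2, 70.0, 70.1, 250.0]:
    flag = monitor.check(reading)
    if flag:
        print(f"reading {reading}: {flag}")
```

A real plant-wide monitor would run hundreds of such checks, add trend and cross-sensor tests, and feed the flags into the maintenance schedule.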

Another issue we frequently encounter is that plants that have been around for many years have changed over time: pieces have been added or removed, different equipment has been purchased and installed at different times, sensors have come and gone, reactor capacities have been increased, and the original designers of the plant may no longer be around. In these cases there is a huge benefit in reassessing all the relationships between the data streams and thoroughly analyzing what influences what, resulting in an interaction map. A clear interaction map helps you re-optimize plant operations in the context of everything else (including potential upstream or downstream plants), assess the value and optimal placement of new sensors, and improve profitability while maintaining documented, current insight into the plant's operations.
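In its most basic form, building such a map starts from pairwise association between historian tags. The sketch below is a deliberately simplified stand-in (toy data, plain Pearson correlation); a real analysis would also use lagged, nonlinear, and causal measures.

```python
# Toy interaction map: correlate every pair of data streams and keep the
# strong links. Stream names and values are illustrative assumptions.
from itertools import combinations
from statistics import mean

def pearson(x, y):
    # Pearson correlation coefficient between two equal-length series.
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

streams = {   # toy time series standing in for plant historian tags
    "feed_flow":    [10, 12, 11, 13, 14, 13],
    "reactor_temp": [70, 74, 72, 75, 78, 76],
    "purity":       [0.90, 0.88, 0.89, 0.87, 0.85, 0.86],
}

interaction_map = {
    (a, b): round(pearson(streams[a], streams[b]), 2)
    for a, b in combinations(streams, 2)
}
for pair, r in interaction_map.items():
    if abs(r) > 0.8:          # keep only the strong interactions
        print(pair, r)
```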

In many cases it is also possible to replace difficult sensors that generate data at low frequency with models that map high-frequency data from cheap sensors to the target sensor's output. These models are called soft sensors (soft as in software). Implementing soft sensors on a plant has several advantages. You can lower the frequency of difficult measurements, e.g., a chemical composition analysis in the lab, from six times a day to once a day. You get a new sensor that produces the same kind of data in real time, which means you can optimize your plant much better. You can also create back-up scenarios in which soft sensors stand in for other sensors whenever they fail or need maintenance. See soft sensors in our gallery of case studies.
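The core idea fits in a few lines. The sketch below assumes toy numbers and the simplest possible model (one cheap signal, a linear fit); real soft sensors typically combine many signals and richer models, but the calibrate-then-predict pattern is the same.

```python
# Toy soft sensor: calibrate on the few moments where a slow lab analysis
# coincides with a cheap sensor reading, then predict in real time.
# All data and the linear model are illustrative assumptions.
def fit_linear(x, y):
    """Ordinary least squares for y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return a, my - a * mx

# Paired observations: reactor temperature (cheap, every few seconds)
# vs. lab-measured product concentration (a few times a day).
temps = [350.0, 355.0, 360.0, 365.0, 370.0, 375.0]
concs = [0.82, 0.79, 0.77, 0.74, 0.71, 0.69]

a, b = fit_linear(temps, concs)

def soft_sensor(temp):
    # Concentration estimate, now available at the temperature sensor's rate.
    return a * temp + b

print(round(soft_sensor(362.0), 3))
```

Once validated against held-out lab measurements, the model runs at the cheap sensor's frequency, and the lab analysis drops back to a calibration role.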

Another critical component in many manufacturing companies is scheduling and logistics. Goods have to be produced in different varieties and quantities and shipped to different customers at different times. Ideally, every customer gets what they ordered, when they want it, and on spec. In reality, orders have to be matched to the production process so that feedstock is used optimally, expensive equipment is evenly occupied to avoid idling, business and operating plan goals are achieved, shipments are regularly scheduled, etc. Aligning operations with customer needs requires a serious transformation of the production process, with ripples of change reaching the sales, inventory, planning, and logistics departments. We are currently working on an enterprise-wide production planning system for the world's largest birch plywood manufacturer, Sveza, and are extremely excited about the results (READ MORE about Sveza OptiPlan and OptiStock).
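The "evenly occupied equipment" objective can be shown on a deliberately tiny example. The sketch below is a toy, not how an enterprise planner works: it brute-forces the assignment of five hypothetical orders to two production lines so that the later-finishing line finishes as early as possible. Real planning systems solve the same kind of problem at scale with mathematical optimization (MILP) solvers.

```python
# Toy two-line scheduling: minimize the makespan (finish time of the
# busier line). Order names and durations are illustrative assumptions.
from itertools import product

orders = {"A": 4, "B": 3, "C": 3, "D": 2, "E": 2}   # hours per order

best_split, best_makespan = None, float("inf")
for assignment in product((0, 1), repeat=len(orders)):
    loads = [0, 0]
    for (name, hours), line in zip(orders.items(), assignment):
        loads[line] += hours
    if max(loads) < best_makespan:       # keep the most balanced split
        best_makespan = max(loads)
        best_split = assignment

print(best_makespan)  # 14 hours of work in total, so a perfect split ends at 7
```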

Call us