Supply Chain Inventory Optimization: How to Avoid the Stockout and Overstock Nightmare
https://www.rulex.ai/supply-chain-inventory-optimization/

You’ve been eyeing that perfect shirt for weeks, only to find out it’s out of stock. Frustrating, right? But here’s the good news – thanks to inventory optimization, that same shirt is now back on the shelves, ready for you to snatch up.

In the fast-paced world of retail and supply chains, inventory optimization is the magic behind ensuring products are where customers want them, when they need them.

It goes without saying that stockouts and overstocking present a real nightmare for businesses, costing e-commerce companies an average of 11% of their annual revenue.

In this article, we’ll dive into how smart inventory solutions can prevent these dreaded issues, streamline operations, and deliver goods faster than ever.

What is supply chain inventory optimization?

Inventory optimization is essentially a giant balancing scale, with demand on one side and supply on the other, that companies trading goods must constantly keep level. It involves strategically managing stock levels to maximize efficiency, minimize costs, and ensure customer needs are met. The goal is to have the right inventory available – not so much that it drives up warehouse storage costs, but enough to buffer against unexpected disruptions.

Considering the many factors that could tip this scale in one direction or the other, finding the perfect balance is a challenging task. But not an impossible one.

What are the key elements of inventory optimization?

Inventory optimization is a multifaceted process that involves various players and stages. However, it can be broken down into three core areas: demand forecasting, safety stock management, and logistics. In other words, to stay efficient, businesses must anticipate future sales, determine the right stock levels, and ensure seamless distribution of goods.

Demand forecasting

This is the process of estimating future demand for a product based on data. While no one can predict the future exactly, businesses can get close by combining historical sales data, customer and financial analytics, and external factors like seasonality and economic conditions.

Safety stock management

Closely tied to a product’s future sales, safety stock is an optimal inventory buffer that prevents stockouts while minimizing excess inventory, ensuring the company meets customer demand. It also acts as a cushion against supplier and production delays, demand fluctuations due to seasonal or unexpected events, and constraints during frozen horizon periods.

Logistics

A company may have sufficient stock to meet demand but not have it in the right locations. This is where logistics come into play. A well-organized flow of goods moving from production lines to various warehouses and distribution centers ensures that stock levels are strategically distributed, making inventory more efficient and accessible where it is needed most. Easy to say but hard to do, considering how complex it is to orchestrate shipment scheduling while also maximizing loads and choosing the best routes to save costs.


The challenges of traditional inventory optimization

Current inventory levels, storage capacity, seasonal trends, product shelf life, upcoming promotional campaigns, supplier lead times and schedules – these are just some of the factors to consider in order to always have the right amount of product ready to fulfill the market’s ever-shifting demands.

Plus, even if we could make almost perfect demand predictions, another big challenge arises: turning them into action fast enough as conditions change. All of these factors make inventory optimization a highly intricate and demanding task, especially in the face of:

Fast-moving consumer demand

In the era of Amazon’s next-day shipping, consumers expect rapid delivery times, making it challenging for companies to predict and respond to shifting demand patterns quickly.

Increased competition

With even more players in the market, companies must find ways to stay ahead by improving their inventory management to meet customer expectations while controlling costs.

Supply chain disruptions

Unpredictable disruptions can halt the flow of goods, leading to stockouts, delays, and the need for costly contingency plans.

Technological challenges

It may sound shocking, but 67.4% of supply chain managers use Microsoft Excel to manage inventory, demonstrating how many are still relying on outdated methods instead of leveraging more efficient, automated technologies.

Inventory optimization strategies

This is an overview of some of the many strategies companies can implement to avoid the stockout and overstock nightmare.

ABC analysis

The ABC of any inventory optimization strategy is, of course, knowing your product, and this is what this method is all about. It classifies products based on their significance to the business, grouping them into three categories: A, B, and C. Category A includes the most valuable and critical items, whereas Category C consists of the least significant ones. By using this system, businesses can better manage inventory, ensuring they maintain appropriate stock levels and focus on the products that provide the greatest value to both their customers and the company.
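To make the mechanics concrete, here is a minimal pandas sketch of the classic Pareto-style ABC split. The column names, figures, and the 80%/95% cut-offs are illustrative assumptions, not Rulex Platform’s implementation.

  import pandas as pd

  # Toy sales data: one row per product (names and figures are invented).
  sales = pd.DataFrame({
      "sku": ["S1", "S2", "S3", "S4", "S5"],
      "annual_revenue": [50000, 30000, 12000, 5000, 3000],
  })

  # Rank products by revenue and compute each one's cumulative share.
  sales = sales.sort_values("annual_revenue", ascending=False)
  share = sales["annual_revenue"].cumsum() / sales["annual_revenue"].sum()

  # Common Pareto cut-offs: top ~80% of revenue -> A, next ~15% -> B, rest -> C.
  sales["abc_class"] = pd.cut(share, bins=[0, 0.80, 0.95, 1.0], labels=["A", "B", "C"])
  print(sales)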

Rulex has a long history of helping major retailers and supply chains implement this very strategy. See how easy it is to identify the most profitable and best-selling products in Rulex Platform in this dedicated webinar: Improving business strategy with ABC segmentation – Webinars – Rulex Community

SKU rationalization

Once a company has identified its Category C products through ABC analysis, the next step is to determine whether these SKUs (Stock Keeping Units) should be discontinued, modified, or retained in the product lineup. For example, a company might choose to keep a specific SKU because, despite its modest sales, it’s a niche product which cannot be found elsewhere, and consequently drives customer loyalty.

To streamline inventory in specific product categories, many retailers use Rulex Platform’s powerful Assortment Optimizer tool. This tool extracts and generates replacement rules from frequent itemsets, helping businesses determine how to replace items with equivalent alternatives for maximum benefit. By strategically removing products and replacing them with these alternatives, Rulex Platform helps businesses optimize their assortment, minimize revenue loss, and prevent customers from switching to competitors.
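As a rough illustration of the idea behind replacement rules – items bought in similar basket contexts can often stand in for each other – the toy sketch below ranks candidate substitutes by the cosine similarity of their co-purchase profiles. It is a simplified heuristic for intuition only, not the Assortment Optimizer’s actual algorithm.

  from collections import Counter
  from itertools import combinations
  import math

  # Toy purchase baskets (invented data).
  baskets = [
      {"tea", "honey", "lemon"},
      {"coffee", "honey"},
      {"tea", "biscuits"},
      {"coffee", "biscuits", "lemon"},
      {"tea", "honey"},
  ]

  # Count how often each pair of items is bought together.
  pair_counts = Counter()
  for b in baskets:
      pair_counts.update(combinations(sorted(b), 2))

  def context(item):
      # Co-occurrence profile of an item: neighbor -> joint frequency.
      return {(a if b == item else b): c
              for (a, b), c in pair_counts.items() if item in (a, b)}

  def similarity(x, y):
      # Cosine similarity between two co-occurrence profiles.
      cx, cy = context(x), context(y)
      dot = sum(cx[k] * cy.get(k, 0) for k in cx)
      nx = math.sqrt(sum(v * v for v in cx.values()))
      ny = math.sqrt(sum(v * v for v in cy.values()))
      return dot / (nx * ny) if nx and ny else 0.0

  # Rank candidate replacements for a SKU slated for removal.
  discontinued = "tea"
  candidates = {i for b in baskets for i in b} - {discontinued}
  ranking = sorted(candidates, key=lambda i: similarity(discontinued, i), reverse=True)
  print(ranking[:3])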

Demand forecasting

We have already mentioned the importance of estimating demand to align supply with market needs. This is another strategy that major companies have implemented using Rulex Platform. For example, Rulex supported a global pharmaceutical company in accurately forecasting sales for several flagship products through an all-in-one solution, which encompassed data pre-processing, modeling, and forecasting within a single platform.

To learn more about demand forecasting and its real-life applications, read our dedicated article.

Safety stock tuning

Downstream demand forecasting is the Holy Grail of safety stock tuning. In simple terms, it’s about determining the optimal amount of inventory to maintain based on both certain and uncertain factors (e.g., confirmed orders, expected orders, item availability, etc.).

Rulex has developed a solution that does exactly that. Using a series of simulations, it analyzes whether and how often the planned stock level could be exceeded. For example, if the estimated safety stock is set at 100 apples, but 20% of simulations show demand surpassing this amount, we will face a 20% risk of stockout if we don’t set a higher safety stock.
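The same idea can be prototyped in a few lines of numpy. In this minimal Monte Carlo sketch, demand over the replenishment period is assumed to be roughly normal; the distribution and its parameters are illustrative, not those estimated by Rulex’s solution.

  import numpy as np

  rng = np.random.default_rng(seed=42)

  # Illustrative assumption: period demand ~ Normal(mean=85, std=18) apples.
  demand = rng.normal(loc=85, scale=18, size=100_000)

  def stockout_risk(safety_stock):
      # Share of simulated periods where demand exceeds the planned stock.
      return (demand > safety_stock).mean()

  print(f"risk at 100 apples: {stockout_risk(100):.1%}")   # close to 20%

  # Raise the stock level until the simulated risk drops to 5%.
  print(f"stock for a 5% risk: {np.quantile(demand, 0.95):.0f} apples")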

Replenishment optimization

Once safety stock levels are set, they must be carefully monitored to ensure smooth operations. Replenishment optimization is the strategy that makes this possible by managing restocking at the right time and in the right quantities. It ensures that inventory is replenished efficiently to meet demand while minimizing costs associated with stockouts, overstocking, and storage.
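In its textbook form, replenishment logic reduces to a reorder point: restock whenever on-hand inventory falls below expected lead-time demand plus safety stock. The sketch below shows the standard formula with invented numbers; it illustrates the general principle, not Rulex’s replenishment solution.

  import math

  avg_daily_demand = 40      # units per day (illustrative)
  demand_std = 12            # standard deviation of daily demand
  lead_time_days = 5
  z = 1.65                   # service-level factor, roughly 95%

  # Standard safety stock and reorder point formulas.
  safety_stock = z * demand_std * math.sqrt(lead_time_days)
  reorder_point = avg_daily_demand * lead_time_days + safety_stock

  on_hand = 220
  if on_hand <= reorder_point:
      print(f"reorder now (reorder point = {reorder_point:.0f} units)")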

To address restocking challenges, Rulex has designed the Network Optimizer. This tool balances the workload distributed over a network, determining the amount of material to move and where to move it. Keep reading to see how this works in a real-life scenario.

Transport optimization

Good logistics ensure that products are where they should be when they are most needed. To help companies achieve this goal, Rulex offers an extremely powerful Transport Optimizer. Known as Rulex Axellerate, it generates shipment schedules that help keep costs and delays down.

With extremely rapid computation times, Rulex Axellerate can create end-to-end, long-term transportation plans in under an hour. When applied to a major logistics network, the solution reduced expedited shipments by 70%. Watch the video to see how Rulex Axellerate works.

In addition to these two vertical solutions, Rulex offers a proprietary, general-purpose solver capable of tackling virtually any optimization challenge – from work shift scheduling to dock optimization – even across multiple domains simultaneously, such as production and packaging.

Real-life success: how Rulex transformed inventory optimization

At Rulex, we’ve worked with some of the world’s largest production and supply chain leaders to revolutionize inventory management. One standout case is a Fortune 50 supply chain company seeking help to maintain optimal stock levels across multiple warehouses and sites.

Traditional planning systems left up to 20% of planning decisions to be handled manually – an inefficient, time-consuming process. Rulex stepped in with an advanced replenishment solution, automating over 90% of these previously unmanaged decisions. With its powerful optimization tasks, Rulex reduced under- and overstocking by 8%, cutting costs by $100K per day in a single market region and lightening the planner workload by an impressive 75%.

By seamlessly adapting to real-world complexities, Rulex’s replenishment solution worked as a game-changer in creating a balanced, cost-effective, and agile supply chain.

Find out more about this specific case study and others in our e-book.

E-book: Navigating Complex Problems with Resilient Solutions

The road to better inventory planning and optimization

If you sell products, knowing where your inventory is, who wants to buy it, and where it needs to go is essential to your business’s success. However, this knowledge alone may not be enough without the right, flexible solutions to act on it.

With a vast array of predictive techniques and optimization solvers, Rulex provides supply chains with a powerful advantage, enabling highly customizable solutions for smarter inventory management.

Avoid the stockout and overstock nightmare

The Supply Chain Balancing Act: Demand Planning and Forecasting in Action
https://www.rulex.ai/demand-planning-and-forecasting-in-action/

Demand planning is both an art and a science – get it right, and your supply chain runs like clockwork. Get it wrong, and you’re left dealing with stockouts, surplus inventory, and unhappy customers.

Ever walked into a store and easily found what you needed? That’s demand planning done right. But achieving this level of efficiency is no easy task: it requires factoring in countless variables, from shifting consumer trends to supplier constraints, all while staying agile in the face of unexpected disruptions (looking at you, COVID-19).

What is demand planning?

Think of demand planning as a balancing strategy, where the goal is to accurately forecast customer demand so that the right products are available in the right place, in the right quantity, at the right time. To get there, demand planning relies on sales trends, market shifts, and lead times to keep inventory levels optimal.
Sounds complicated? It is. Supply chains are deeply interconnected, and a single miscalculation at any stage can set off a domino effect that will leave businesses scrambling or customers empty-handed.

Demand planning for inventory management

Whether literal or figurative, every product has an expiration date. Effective demand planning is necessary for knowing how much inventory to keep to meet customer needs, without the financial drain of overstocking or the lost revenue from stockouts.
Poor inventory management can have a massive financial impact: according to the global consulting company McKinsey, companies that can predict customer behavior more accurately reduce inventory levels by 20%, leading to significant reductions in inventory costs [1,2].

The importance of forecast accuracy

The value of accurate demand planning extends far beyond inventory management. Forecasting is also a strategic tool for financial planning. Critical business decisions such as pricing strategies, revenue projections, and capacity planning are directly tied to the demand management process. A precise forecast helps businesses make data-driven decisions that foster growth.

Collaborative planning for Supply Chain optimization

Supply chain efficiency is a team effort, where even the best forecasts can fall short if there is poor alignment between departments.
To understand why the demand management process is not an isolated task, let’s break down the complex web of a supply chain. It all starts with suppliers delivering raw materials to manufacturing facilities, where products are produced. From there, finished goods are sent to distribution centers, awaiting shipment to retail stores or directly to customers. Unexpected disruptions at any step of the chain can throw the entire system off balance. Imagine a scenario where the sales team anticipates high demand, but procurement fails to secure enough raw materials. Or if production increases, but warehouses are already at full capacity. Even identifying which finished products drive the highest profits through an ABC analysis is crucial for knowing which goods to prioritize at the very first step of the chain.
So, how do businesses navigate these challenges and build a more resilient approach to demand forecasting? By following industry best practices.


Best practices for implementing demand planning

At its core, forecasting relies on two types of information: certain data, such as known warehouse stock, and uncertain factors, like fluctuating customer demand. Success depends on estimating these uncertainties and making informed decisions to maintain the right stock levels. Identifying an issue in your demand planning is not enough – if the software used to make predictions takes too long to adjust, your business risks missing sales opportunities and becoming trapped in operational inefficiencies.
Let’s break down the steps to mastering demand planning:

Collecting and preparing data

Strong demand planning starts with high-quality data. This involves collecting information from different sources and formats, integrating it, and cleansing it for analysis. The faster and more adaptable your system, the better.

Defining process models

Should your business prioritize aggressive stockpiling or a lean inventory approach? Determining the best strategy is rarely straightforward. The same applies to choosing the right forecasting model for each product, as not all products behave in the same way! For example, printer cartridges tend to follow a stable and predictable pattern, while high-end smartphones see sudden demand spikes and shorter life cycles.

Choosing the right software

Invest in demand planning software with advanced AI and ML algorithms, as well as other capabilities that enhance forecast accuracy, decision-making, and allow for quick adjustments. A robust platform should support real-time recalculations and generate multiple demand forecasts based on specific products, adjusting to different assumptions and probabilities.

Implementing and monitoring

Use automated workflows to test multiple forecasting models, select the most effective one, and check that it’s still grounded in reality by monitoring sales and new external variables. Even better if the solution can be integrated with other supply chain tools already in use.
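A bare-bones version of this “test several models, keep the best” loop can be sketched with a holdout split and a simple accuracy measure such as MAPE. The candidate models below are deliberately naive baselines chosen for illustration; a real setup would plug in richer models.

  import numpy as np

  sales = np.array([102, 110, 98, 120, 130, 125, 140, 152, 149, 160, 171, 168])
  train, test = sales[:-4], sales[-4:]

  def naive(train, horizon):
      # Repeat the last observed value.
      return np.repeat(train[-1], horizon)

  def moving_average(train, horizon, window=3):
      # Repeat the mean of the last `window` observations.
      return np.repeat(train[-window:].mean(), horizon)

  def drift(train, horizon):
      # Extrapolate the average historical step.
      slope = (train[-1] - train[0]) / (len(train) - 1)
      return train[-1] + slope * np.arange(1, horizon + 1)

  def mape(actual, forecast):
      return np.mean(np.abs((actual - forecast) / actual)) * 100

  models = {"naive": naive, "moving average": moving_average, "drift": drift}
  scores = {name: mape(test, f(train, len(test))) for name, f in models.items()}
  print(scores, "-> best:", min(scores, key=scores.get))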

Demand planning software: what Rulex Platform can do

Rulex Platform is an all-in-one decision intelligence software that allows companies to seamlessly manage every aspect of demand forecasting – orchestrating data, improving and supporting decisions, and automating processes. All with the flexibility to easily integrate into the existing infrastructure, thanks to capabilities like APIs. Designed with a drag-and-drop interface, Rulex Platform allows non-technical users to mix and match the advanced technologies of its toolkit to create tailored solutions. As a plus, it also includes an integrated dashboarding and custom UI component, providing a smart way to visually interact with underlying data and solutions.
From mitigation sources planning to replenishment optimization and simultaneous production and work shift planning, Rulex Platform is trusted by companies and supply chains worldwide in their digital transformation journey (you can explore some success stories in our Supply Chain case studies e-book). But first, let’s examine a practical example of demand planning in action.


A real-life demand planning example

Recently, a global pharmaceutical company turned to Rulex Platform to improve demand forecasting for several flagship products based on both historical sales and external variables such as temporal trends, the diffusion of seasonal viruses, and geographic information. Using AutoML, Rulex’s decision intelligence software automatically selected the best forecasting model, allowing for parallel testing of multiple strategies. The company’s domain experts could check what was going on and make adjustments at each point of the process, thanks to the inherent transparency of Rulex’s technology.
The solution delivered demand forecasts over a three-month forecasting horizon, updated dynamically to incorporate new data and ensure that the latest trends were always factored in. The results spoke for themselves: accurate forecasts (average error below 10%), smooth integration into inventory and logistics systems, and a process that helped the company make smarter decisions.

But what about demand forecasting for newly launched products with no sales history? In another real-world case, a furniture company used Rulex Platform’s advanced analytics and clustering tools to tackle this challenge. The software analyzed key attributes of new items, such as category, brand, and size, to find similarities with existing products that had sales data. By comparing these attributes, the company was able to generate an initial demand forecast, even in the absence of historical sales records.
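A toy pandas version of that attribute-matching idea might look like the sketch below: score existing products by how many attributes they share with the new item, then average the early sales of the closest matches. Column names and figures are invented for illustration.

  import pandas as pd

  catalog = pd.DataFrame({
      "product":  ["chair A", "chair B", "table A", "shelf A"],
      "category": ["chair", "chair", "table", "shelf"],
      "brand":    ["north", "south", "north", "north"],
      "size":     ["M", "M", "L", "S"],
      "first_quarter_sales": [340, 295, 180, 120],
  })

  new_item = {"category": "chair", "brand": "north", "size": "M"}

  # Similarity = number of matching attributes (a crude stand-in for clustering).
  attrs = ["category", "brand", "size"]
  catalog["similarity"] = sum((catalog[a] == new_item[a]).astype(int) for a in attrs)

  # Initial forecast: average early sales of the most similar existing products.
  peers = catalog.nlargest(2, "similarity")
  print("initial forecast:", peers["first_quarter_sales"].mean())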

Once you’ve perfected demand forecasting, the next step is inventory optimization – if that’s something you’re exploring, we’ve got an article on it too.

The future of demand planning

Companies investing in advanced forecasting solutions, such as AI-driven machine learning and predictive analytics, will stay ahead of market shifts, contributing to a global economic impact of $9.2 trillion by 2029 [3].

Is your demand planning strategy ready for what’s next? Get in touch today to find the right solution for your business.

Find the right supply-demand balance

The Data Governance-Security Connection: Protecting Your Business from the Inside Out
https://www.rulex.ai/the-data-governance-security-connection-protecting-your-business-from-the-inside-out/

In today’s digital economy, data is the driving force behind innovation, helping us make smarter decisions and stay ahead of the competition. Yet, data can easily be compromised, leading to costly financial setbacks and lasting reputational damage.

That’s why customers and partners are more cautious about who they trust with their sensitive information. For businesses, it’s no longer enough to simply handle data – they need to prove they can keep it safe. This shift has transformed data governance and data security from ‘nice-to-haves’ to essential pillars of business operations.

Average cost to mitigate a data breach, per incident

Data governance vs data security: what’s the difference, and why do you need both?

Terms like data governance and data security are often used interchangeably, but while they work hand in hand, each plays a unique role in a company’s data strategy.

Data governance is all about setting up internal policies, procedures, and frameworks that define who owns the data, who can access it, and how it’s used. This keeps data organized, consistent, and reliable, helping businesses maintain control over their most important information and prevent employee misuse or accidental mishaps.

A strong data governance structure also opens the door to data democratization, giving users broader access to data while ensuring compliance with established standards. For instance, systems like data mesh break down master data into manageable units, which can be widely accessible to the people who need them, while modifications remain in the hands of a few who have full ownership.

Data security, on the other hand, focuses on protecting data from external threats, both when it’s stored and when it’s exchanged. Techniques such as encryption come into play here, scrambling sensitive information into unreadable code to safeguard it from unauthorized access.
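To make this tangible, here is a minimal sketch of authenticated encryption with AES-256-GCM using the open-source Python cryptography package – the same cipher family mentioned later in this article, though the snippet is an illustration, not Rulex Platform’s internal code.

  import os
  from cryptography.hazmat.primitives.ciphers.aead import AESGCM

  key = AESGCM.generate_key(bit_length=256)   # 256-bit key, i.e. AES-256
  aesgcm = AESGCM(key)
  nonce = os.urandom(12)                      # 96-bit nonce; never reuse with the same key

  plaintext = b"customer_id=4711;credit_limit=25000"
  ciphertext = aesgcm.encrypt(nonce, plaintext, None)    # unreadable without the key
  assert aesgcm.decrypt(nonce, ciphertext, None) == plaintext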

But here’s the thing: without proper governance, data security is like locking your doors without knowing who holds the keys.

So, how can companies make sure their governance and security framework is up to par?


The 6 pillars of data governance and security

There are software solutions that can provide companies with the highest standards of security and encryption at every stage of the data lifecycle, while fostering collaboration between teams.

Rulex Platform is designed to meet these critical needs, supporting the creation of a strong, reliable data governance and security framework through six main capabilities:

  1. Data catalogs. Rulex Platform supports data mesh systems, facilitating the creation of sub-catalogs within the user’s preferred cloud environment and enabling resources to be accessed via a shareable yet encrypted username and password.
  2. Clear data ownership. Rulex Platform provides granular control over roles and permissions, which can be assigned to individual users or groups. This means users can decide exactly who gets to see, execute, or delete specific resources, preventing any unauthorized or accidental mishaps.
  3. Policy monitoring. Rulex Platform automatically logs all access to and operations performed within the software. Modifications to the settings and preference files are also logged and encrypted to prevent external alteration.
  4. Event logs. Rulex Platform supports security information and event management (SIEM) technology for log event collection and real-time analysis of security alerts, simplifying compliance tracking and prioritizing incident response efforts.
  5. Advanced encryption. Rulex Platform uses the latest and most advanced TLS 1.3 protocol and AES-256-GCM encryption to protect data at rest and in motion. Users can choose to have encryption keys generated automatically or provide their own, giving the flexibility to match specific security needs.
  6. External vault systems integration. Rulex Platform securely stores sensitive configuration data within its environment or through external vault systems, allowing real-time retrieval of credentials and secrets without embedding them in source code, minimizing the risk of exposure.

Granular role and permission management via Rulex Platform’s interface.

Rulex Platform is also ISO 27001 certified, aligning with globally recognized standards for information security management.

Build lasting customer trust

In an era where data breaches make headlines, implementing a strong data governance and security framework has become a business necessity.

Define custom roles and permissions, keep an eye on all access and changes, and lock down your data with top-notch security and encryption. Request a 30-day Rulex Platform free trial now.

Securely manage your data down to the finest detail with Rulex Platform

What to Do with All This Data? Unlocking Actionable Insights with Analytics
https://www.rulex.ai/what-to-do-with-all-this-data-unlocking-actionable-insights-with-analytics/

The digitization of information, together with the proliferation of sensors and various monitoring systems, has dramatically increased the amount of data available to companies. Practically every business holds large amounts of data across all areas, from sales to production phases, and from marketing to logistics. When faced with such vast amounts of data, the first question that comes to mind is: “What do we do with all this data now?”. Even if the question may sound a little naive, managing the complexity of all this data and figuring out what to do with it is a significant challenge for many businesses, regardless of their size. Of course, a decision maker will have no doubts about what they want to achieve: better decisions! But what does this mean? And how can data be converted into those outcomes?

In general, analytics refers to the process of transforming raw data into helpful insights that support decision-making. Analytics encompasses various techniques, including statistics, computer science, and domain-specific knowledge to extract actionable insights. The visualization of data and insights also plays a pivotal role since those who make decisions often do not have the time (or the technical skills) to explore all the details about the data and want to get right to the point: how is business going and how can I improve it?

What is Advanced Analytics?

Recent advancements in computer science have led to a new generation of analytics techniques that allow stakeholders to automatically uncover deeper patterns and make more accurate predictions. These techniques include machine learning, artificial intelligence (AI), natural language processing, and statistical modeling. Advanced analytics provides the means not only to understand what has happened but also to forecast future trends, optimize processes, and simulate various business scenarios.

For example, while traditional analytics might show that sales dropped in the last quarter, advanced analytics can use machine learning algorithms to predict sales for the upcoming quarter, identifying the key factors that might influence future outcomes. This capability is vital in industries such as finance, healthcare, and supply chain management, where predicting future trends (and the reasons behind them) can significantly impact decision-making.

Different analytics for different questions

Regardless of the techniques used to analyze data, decision-making requires answering a series of increasingly complex questions:

  • What happened?
  • Why did it happen?
  • What is going to happen?
  • What should I do?

Each question corresponds to a deeper level of insight into the data, allowing for progressively more informed decision-making. Accordingly, different levels of analytics can be identified, each aligned with these questions.

  • Descriptive analytics is the simplest form of analytics, aimed at answering the question: “What happened?”. Its goal is to summarize and describe historical data using simple techniques like data aggregation, basic statistics, and reporting. For example, reviewing last quarter’s sales figures is a typical descriptive analytics task.
  • Starting from descriptive analytics insights, diagnostic analytics addresses the question: “Why did it happen?”. Its objective is to understand the underlying causes of events and trends identified by descriptive analytics, using techniques such as drill-down and correlation analysis. A typical diagnostic task involves identifying events that are correlated with malfunctions in a plant.
  • The objective of predictive analytics is to forecast events that are likely to happen in the future based on historical data. It aims to answer the question “What is going to happen?” using techniques such as regression analysis, machine learning, and time series forecasting. For example, companies might use predictive analytics to forecast future sales based on past trends and other information.
  • Prescriptive analytics is the most advanced type of analytics, focusing on recommending actions to optimize business performance. It answers the question: “What should we do?”. Prescriptive analytics often combines predictive models with optimization algorithms to propose the best course of action. For instance, a logistics company might use prescriptive analytics to optimize its delivery routes based on traffic patterns, demand prediction, and delivery schedules.
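The first three levels of this ladder can be demonstrated on a toy sales series in a few lines of numpy; the prescriptive step typically requires an optimization model, discussed later in this article. The data and candidate driver below are invented for illustration.

  import numpy as np

  months = np.arange(1, 13)
  promo = np.array([0, 0, 1, 0, 0, 1, 0, 0, 1, 0, 0, 1])   # promotion flag
  sales = 100 + 3 * months + 15 * promo + np.random.default_rng(0).normal(0, 4, 12)

  # Descriptive: what happened?
  print("last-quarter average:", sales[-3:].mean().round(1))

  # Diagnostic: why did it happen? (correlation with a candidate driver)
  print("sales~promo correlation:", np.corrcoef(sales, promo)[0, 1].round(2))

  # Predictive: what is going to happen? (simple linear trend)
  slope, intercept = np.polyfit(months, sales, 1)
  print("month 13 forecast:", round(slope * 13 + intercept, 1))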

Visual platforms for analytics and programming languages

Several tools can support analytics at different levels. Visual platforms, for instance, help make data insights accessible to a broader audience through customized interactive dashboards, charts, and reports. The ability to visualize data helps users, even those without a technical background, to comprehend complex data sets and make informed decisions. The goal of these tools is to hide complexity from the end user, showing only the relevant information.

On the other hand, programming languages like Python and R offer numerous libraries that enable the implementation of virtually every state-of-the-art analytics technique. Their goal is to provide a wide range of methods and maximum flexibility to combine them according to the user’s needs. However, in-depth technical skills are required to master the complexity of programming languages.

Performing analytics on Rulex Platform

Rulex Platform was designed to overcome the limitations of both visual software and programming languages, offering the ease of the former and the flexibility of the latter. Rulex is a no-code platform that integrates several analytics tools to address the needs of different types of users.

Decision makers can use Rulex Studio to get a bird’s-eye view of their business data. This platform component allows users to create interactive dashboards and custom UIs where they can consult graphs and tables, applying the necessary filters and aggregations. They can also interact with the underlying data, launching computations, providing feedback on results, and even updating and synchronizing information in seconds.

Data scientists can make the most of a wide range of data management and artificial intelligence tools offered by Rulex Factory. This platform component, where flows are built in a drag-and-drop environment, can replace the tools commonly used to manipulate data and extract models, such as Pandas and TensorFlow, improving development speed and solution usability.

It provides a comprehensive portfolio of machine learning methods, ranging from unsupervised learning (clustering, anomaly detection) to supervised learning (classification, regression), and even includes proprietary algorithms, such as the eXplainable AI Logic Learning Machine. Coding enthusiasts can also integrate their Python or R code into Rulex flows, reducing the need to switch data back and forth.

Data engineers and data integration specialists use Rulex Factory to seamlessly merge data from different sources, bringing them together in a single view. The platform supports the integration of almost any data source, from common relational databases to everyday file types such as Excel and PDF. Additionally, users can connect to data stored in remote locations, including cloud storage or HTTP servers.


How Rulex can answer your questions

Regardless of the team you’re in or the data job you have, Rulex Platform can help answer all your analytics questions.

By using data management tools and building interactive dashboards, you can better understand what happened (and what is happening) to your business. Performing statistical correlations and extracting explainable machine learning models from your historical data can help you identify the root causes of your business trends, understanding why they happened. Inferring from machine learning models trained on past data allows you to predict what is more likely to happen in the future. Last but not least, several tools can support you in understanding the best action you can take.

Optimization plays a pivotal role in providing a targeted action plan, as it finds the configuration that maximizes desired results (expressed in an objective function) while fulfilling all the operational constraints. Nonetheless, optimization has two well-known drawbacks: (i) its implementation is quite complex, especially in the presence of several constraints, and (ii) the computational effort to find the optimal solution can grow exponentially.

Rulex Platform has solutions for both these issues with its mathematical optimization task, Build & Solve. The task dramatically simplifies the definition of linear programming problems, speeding up the setup phase (i), and implements specific algorithms able to find a feasible solution in minutes, or retrieve sub-optimal alternatives when the optimal solution cannot be retrieved within a reasonable time-frame (ii).
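Build & Solve hides this machinery behind spreadsheets, but for readers curious about what a linear programming problem looks like once modeled, here is a deliberately tiny example solved with scipy’s open-source solver (not Rulex’s proprietary one): shipping goods from two warehouses to two stores at minimum cost.

  from scipy.optimize import linprog

  # Decision variables: x = [w1->s1, w1->s2, w2->s1, w2->s2] units shipped.
  cost = [4, 6, 5, 3]                  # objective: minimize total shipping cost

  # Hard constraints: each store's demand must be met exactly...
  A_eq = [[1, 0, 1, 0],                # units arriving at store 1
          [0, 1, 0, 1]]                # units arriving at store 2
  b_eq = [80, 70]

  # ...and each warehouse cannot ship more than its stock.
  A_ub = [[1, 1, 0, 0],                # warehouse 1 outflow
          [0, 0, 1, 1]]                # warehouse 2 outflow
  b_ub = [100, 90]

  res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                bounds=[(0, None)] * 4, method="highs")
  print(res.x, res.fun)                # optimal shipments and total cost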

Question – Rulex Platform tools that provide answers

  • What happened? – Rulex Studio (Dashboards), Data Manager (Queries), data integration tasks (Join, Pivot Table)
  • Why did it happen? – statistical analysis, correlation plots, Explainable AI tools
  • What is going to happen? – machine learning algorithms, Time Series Analysis, Explainable AI tools
  • What should I do? – Mixed Integer Linear Programming, Build & Solve, Rule-Based Control, Rulex Studio (Custom UI)

Hands-on analytics: the supply chain department of a CPG manufacturer

Let’s get practical and explore how Rulex Platform can assist the major players in a manufacturing company with their everyday tasks.

In the supply chain department of a large CPG (Consumer Packaged Goods) manufacturer, managers face the challenging task of ensuring that goods are sent to several distribution centers while fulfilling all requests and reducing delivery costs. Their daily responsibilities involve managing multiple datasets stored in different locations, including production data, orders, carrier information, and delivery costs. To gain a comprehensive overview of their business performance, they can use Rulex Platform to integrate all this data and create a dashboard containing relevant insights. Experts can rely on this dashboard to access historical data about orders, production, and transportation, updating it with fresh information as needed.

The company’s data scientists, in turn, can employ Rulex’s statistical functions and machine learning techniques to analyze past behaviors and predict future outcomes. For instance, they can investigate why a shortage occurred by using the Logic Learning Machine, which generates if-then rules correlating the outcome (e.g., the shortage) with its potential causes. They can also forecast the demand for various items from logistics centers in the coming days, weeks, or months using time series models like ARIMA in the Auto Regressive task.
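Outside Rulex, the same ARIMA family is available in open-source form; a minimal statsmodels sketch follows, with invented data and model order (not the Auto Regressive task’s actual configuration).

  import numpy as np
  from statsmodels.tsa.arima.model import ARIMA

  # Two years of monthly demand for one item (invented numbers).
  demand = np.array([120, 125, 130, 128, 135, 150, 160, 158, 150, 140, 135, 130,
                     125, 130, 138, 136, 142, 158, 170, 165, 156, 147, 140, 137],
                    dtype=float)

  # Fit ARIMA(1,1,1) and forecast the next quarter.
  fitted = ARIMA(demand, order=(1, 1, 1)).fit()
  print(fitted.forecast(steps=3))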

Finally, and most crucially, Rulex Platform can recommend specific actions to reach target goals. After using Rulex to process data and forecast missing variables (such as demand), planners can use production and demand insights to determine what, when, and where to ship.

Rulex’s prescriptive tools, such as optimization tasks, guide the distribution of goods. Rulex Platform’s Network Optimizer and Transport Optimizer tasks are specifically designed to address this challenge. Their outcome is a list of goods, together with the quantity to be sent to each location, and the optimal delivery configuration, reducing costs and environmental impact.

What should you do with all your data?

If you feel you are not making the most of your data, maybe it’s time to explore Rulex Platform and its rich suite of advanced data management and analytical tools.
Request a 30-day free trial now.

Enrico Ferrari, Head of R&D Projects

Mathematical Optimization: How to Avoid Getting Stuck in Rabbit Holes
https://www.rulex.ai/mathematical-optimization-how-to-avoid-getting-stuck-in-rabbit-holes/

Running a business involves navigating a variety of challenges, many of which can be tackled effectively through mathematical optimization. Yet, without recognizing these challenges as such, business experts may get sidetracked by less effective methods, ultimately going down technological rabbit holes.

Even when optimization is identified as the right approach, translating a business problem into a mathematical model can be intimidating, adding another layer of complexity. After all, implementing an optimization problem is no simple task. Alright, so how do we get past these roadblocks?

This article will explore how emerging tools are simplifying these traditionally complex processes, making optimization more accessible and practical for everyone.

What mathematical optimization actually is, why it is important for companies, and why it’s so damn complicated

Optimization is the process of enhancing operational efficiency and performance by identifying the best-fit solution for a specific business scenario. It typically involves setting a well-defined goal, such as reducing operational costs or increasing revenue, while considering many factors and navigating multiple constraints.

In the context of logistics, for example, a company must determine the optimal combination of shipments while considering factors such as delivery times, costs, and availability of drivers and goods. Constraints can be soft, such as preferable delivery dates, which may allow flexibility albeit with penalties, or hard, such as insufficient inventory levels, which cannot be violated.
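This soft/hard distinction maps directly onto how such problems are modeled: hard constraints are inviolable inequalities, while soft constraints become slack variables penalized in the objective. A hypothetical sketch with the open-source PuLP modeler (used here purely to illustrate the concept, not Rulex’s solver):

  import pulp

  prob = pulp.LpProblem("shipping", pulp.LpMinimize)
  ship = pulp.LpVariable("units_shipped", lowBound=0)
  shortfall = pulp.LpVariable("shortfall", lowBound=0)   # slack for the soft constraint

  # Objective: shipping cost plus a penalty per unit of missed preference.
  prob += 2 * ship + 10 * shortfall

  # Hard constraint: cannot ship more than current inventory.
  prob += ship <= 60

  # Soft constraint: we'd like 80 units by the preferred date; a shortfall
  # is allowed, but it costs us through the objective's penalty term.
  prob += ship + shortfall >= 80

  prob.solve(pulp.PULP_CBC_CMD(msg=0))
  print(pulp.LpStatus[prob.status], pulp.value(ship), pulp.value(shortfall))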

There are many ways these factors can be combined to reach the desired outcome, and optimization tools aim to find the optimum, or near-optimum, solution in order to maximize efficiency.

Modeling ➡ Solving

Modeling

This phase involves translating the business problem into mathematical formulas, which are then converted into a constraint matrix for the solver algorithm to process and find the optimal solution. Formulating business problems mathematically is a highly complex and time-consuming task. Additionally, transforming these formulas into a constraint matrix generally requires programming skills.

Solving

At this stage, the solver algorithm attempts to find an optimal feasible solution based on the constraints defined earlier. However, in most real-world problems, current solver algorithms on the market can take hours or even days to find a solution – and sometimes, they don’t succeed at all. This often happens when there are too many hard constraints, making an “optimal” solution virtually unattainable.

The most frustrating part is that when a feasible solution cannot be found, these algorithms provide no explanation, offering no insight into which constraints could be adjusted to achieve a solution. They operate like a “black box” that simply rejects the input, leaving us to start over from scratch.

How Rulex revolutionizes mathematical optimization

Rulex has just released a new advanced optimization tool that revolutionizes traditional approaches, reducing time and complexity in both the modeling and solving phases. Part of Rulex Platform’s suite, Build & Solve is a powerful optimization task where business experts can define the problem along with its hard and soft constraints (Build), then apply an advanced algorithm to find a solution (Solve), all within a reasonable time frame and without the need for advanced technical skills.

In the Modeling (or Build) phase, no mathematical formulas are needed. Users can define problems using logical syntax within familiar spreadsheets, while the Build & Solve task itself automatically generates the constraint matrix from the business data and spreadsheets.


This is already a significant advantage. But there’s more.

In the Solving phase, the Build & Solve algorithm is extremely rapid. While traditional tools may take hours or even days, Rulex can find a feasible solution within seconds or minutes. If a solution is not found on the first attempt, the tool identifies which specific constraints are preventing a feasible solution, thus allowing experts to quickly assess and address the issue. This approach significantly reduces the need for repeated, time-consuming calculations and accelerates the path to a viable solution.

In addition to offering all the essential tools for gathering and pre-processing data, Rulex Platform also provides comprehensive capabilities for post-processing and visualization, allowing results to be presented to end-users exactly as and where they need them.


Real-world applications of Build & Solve

Build & Solve is not some future proof-of-concept; it’s already in production at numerous global companies, particularly in supply chains and manufacturing, where it is efficiently optimizing processes as we write. One of its most successful applications is in scheduling, ensuring that tasks and actions within a plan align with specific business objectives.

Scheduling optimization:

The tool was used to develop a solution for a Fortune 50 company, optimizing production efficiency by simultaneously tackling both product and packaging planning. This was a highly complex challenge, given the diverse constraints – ranging from managing different batch sizes to coordinating multi-stage orders and machines that cannot operate simultaneously.

The company’s experts were able to define the problem using logical syntax in spreadsheets, avoiding the need for complex mathematical formulas. Thanks to the tool’s user-friendly interface and rapid processing, the solution was deployed in under a month. Now, production planning can be optimized in less than 10 minutes, significantly outperforming traditional manual methods in both speed and accuracy, resulting in a threefold increase in production efficiency.

More in-house optimization tools

While Build & Solve serves as a general purpose optimization tool, Rulex also offers specialized tasks for specific supply chain and logistics challenges, including warehouse and transport optimization. With these tools, the problem framework is already established; users simply need to customize it with their specific case details and data.


Avoid getting stuck down rabbit holes

Tailored around business needs, Build & Solve offers extremely rapid and efficient problem resolution, so you don’t get stuck down rabbit holes.

Do you have a business challenge that feels insurmountable? Bring your case to us, and together we’ll find a solution. Schedule a free consultation with one of our experts today!

Optimize process efficiency for peak performance

Decision Intelligence Platforms: Stop Taking Crappy Decisions
https://www.rulex.ai/decision-intelligence-platforms-stop-taking-crappy-decisions/

How many decisions do we make every day? And how much effort is involved in making informed and logical choices? In this fast-paced world, millions of decisions must be made every minute, from sending an email to deciding how much coffee to stock for your café or determining whether the insurance claim you are reviewing involves fraud. According to a recent survey, 65% of decision-makers find making decisions more complex than it was just two years ago, and 53% said they face more pressure to explain or justify their decisions.

Sometimes, even the old-fashioned, yet still extremely valuable, approach based on experience doesn’t pay off, becoming more of a gut feeling than anything else. This happens because of the overwhelming amount of information to process and consider in order to make these decisions, as well as the complex dependencies of the factors that have led to them.

So, if some of your decisions are not as sharp as you thought, first, it is not entirely your fault, and second, technology can provide tools to improve and augment your decision-making.

What is decision intelligence?

Decision intelligence (DI) combines cutting-edge technologies such as AI and mathematical optimization with human expertise to create a cohesive framework for answering key questions, such as: What will be the outcome if I take this action today, given the current context? What actions should I take now to maximize the likelihood of achieving my goals? What strategies can I employ to reduce costs?

Listed in their top 10 strategic technology trends of 2024, Gartner defines DI as a “practical discipline that improves decision making by explicitly understanding and engineering how decisions are made, and outcomes evaluated, managed and improved by feedback”. DI is, therefore, no different from the decision flow that you and your team create when making a decision during a meeting.

But is DI just a buzzword? McKinsey predicts that 70% of businesses will rely on decision intelligence by 2030. So, it may be worth having a further look into the topic.

How is decision intelligence different from business intelligence?

The first question that may arise when hearing about DI is how it differs from business intelligence (BI) – the practice of presenting relevant, historical business information and querying data in visual formats. Decision intelligence can be seen as BI 2.0, diving much deeper into the inner workings of data.

Suppose you want to analyze your company’s sales. While business intelligence reveals past sales trends, decision intelligence goes further by explaining why sales fluctuated and identifying which factors were more or less influential. This insight helps address business challenges more effectively. In other words, DI is an action-oriented practice that provides the additional knowledge needed to elevate your business operations to new levels.


How decision intelligence platforms work

Decision intelligence platforms orchestrate large volumes of data into a unified view, leveraging cutting-edge technologies like eXplainable AI to support intelligent decisions, and then executing and automating decision flows, saving you time.

These platforms must be designed to scale solutions easily and incorporate human expertise and feedback. The key is that decisions made using decision intelligence platforms should not only be superior in quality, accuracy, and effectiveness but also augmented by a hybrid system of advanced technologies and human judgement. To facilitate this human-machine collaboration, transparency is crucial, offering clear insights into decision-making processes, so users can understand and evaluate any automated decisions.

“Inform with accurate data, decide with advanced and explainable tools, and repeat with automation” is the mantra of good decision intelligence. Listed by Gartner among the top decision intelligence platforms on the market, Rulex is an innovative platform where these three phases are all integrated into the same workspace. This means you can create solutions to improve decision-making starting from your in-house data and following the entire process step by step.


Collecting information with agility

Call it what you want; we call it data agility, meaning the capacity to collect, aggregate, and harmonize data smoothly from any source and format. But Rulex Platform’s agility capabilities do not stop there.

It also ensures your data is of top quality through a combination of data quality technologies, from traditional data cleansing to rule-based validation and AI-driven self-healing. Not only can these multiple approaches tackle virtually any data quality issue, but, equally important, they can be handled and applied by citizen developers, without the need to constantly stress the IT team for help and support.


A full decision intelligence toolkit

When data preparation is done, let the fun begin. Decision making is not only about data: it’s about reaching your organization’s goals, with data as a key ingredient.

Rulex provides a comprehensive suite for an outcome-based approach. Whether optimizing logistics, making informed credit scoring decisions, or gaining insights into your product landscape, virtually any solution can be built in Rulex Platform. Additionally, its WYSIWYG graphical interface and transparent technology enable users without technical skills to design solutions independently.

Native eXplainable AI – Generates output in the form of understandable if-then rules, leading to peak performance while ensuring transparency, compliance, and adherence to ethical AI standards. Rulex’s proprietary XAI technology has resulted in multi-million-dollar savings for global corporations in areas such as customer loyalty and fraud detection.

Mathematical optimization – Finds optimal solutions for complex business scenarios by managing intricate physical and logical constraints defined in common spreadsheets. Global supply chains use this tool to minimize costs and maximize revenues in areas such as deployment, dock, and transport optimization, and even simultaneously optimizing multiple areas like warehousing and work shifts.

Business rule engine – Serves as a powerful tool for defining, enhancing and applying business rules to data. It allows users to write business rules in a simple Excel file using intuitive syntax. It has been used for advanced master data validation, lending and credit risk management, and strategic supplier diversification.

What-if scenario simulator – Forecasts the behavior of complex processes and recommends precise modifications to achieve desired outcomes without costly real-life experiments. It has been employed to reduce energy consumption, predict equipment maintenance, and improve retention strategies.

AutoML – Simplifies and accelerates data analytics through a user-friendly, guided workflow. By automating time-consuming tasks like data preparation, pre-processing, and feature selection, it facilitates seamless experimentation with multiple machine learning models. It is widely used by global financial organizations for applications such as credit scoring, wealth management, product cross-selling, and up-selling.

Decisions are human, repetition can be automated

Once you have built your decision flow and are satisfied with it, the final step is to automate it. Automation ensures smooth data flow across systems, providing up-to-date analytics, and streamlining company processes, ultimately saving time. Rulex Platform provides end-to-end workflow automation, covering everything from data collection and integration to analysis and reporting.

Finally, while many software solutions struggle to integrate with existing infrastructure without causing major disruption, Rulex excels. Both the cloud and server versions of Rulex offer a comprehensive catalog of REST APIs, facilitating smooth connections with multiple systems and services. To further enhance collaboration and efficiency, Rulex Platform provides all the necessary features to support the creation of DevOps pipelines, integrating the software with existing tool-chains.

Empowering your decisions

Making decisions is not new; we, as humans, have done it for millions of years. However, the uncertainty of today’s market and geopolitical landscape demands a quicker pace. Making faster and better decisions is crucial for maintaining a competitive edge. Rulex Platform can be your trump card, empowering your company while seamlessly integrating with your existing infrastructure and systems.

Discover Rulex Platform’s decision intelligence toolkit

Weeding the Data Garden: How Rulex Platform Cultivates Quality
https://www.rulex.ai/weeding-the-data-garden-how-rulex-platform-cultivates-quality/

Whether it’s an out-of-range value or an incorrect format, the quality of data is fundamental to any data-reliant process and significantly impacts the results, despite often being overlooked. Imagine an enterprise that decides to undergo a large end-to-end business transformation program, with the final aim of switching to the newest APS (Advanced Planning System) featuring many fancy capabilities. If the data provided to that software is not consistent and accurate, the results will be adversely impacted.

So, we need to monitor quality in order to ensure adequate accuracy; and to monitor quality we need to… define what data quality is. In fact, this is rather an “umbrella term” referring to different issues in the data: accuracy, completeness, consistency, timeliness, validity, and uniqueness are some data quality dimensions. You can find more information (and more dimensions!) by googling around, so here we will rather focus on the solution, which, like the problem, is also multi-faceted.

Rulex Platform provides different capabilities and approaches to solve different data quality issues, in the same way as a gardener uses various tools and practices to uproot the weeds and make their garden bloom.


When it comes to harmonizing fragmented data, handling missing values and duplicates, and formatting errors or outliers, Rulex Platform can quickly spot and correct the issue.

A typical Rulex flow includes these cleansing activities among the first actions performed on the raw, dirty dataset. Specific tasks are available which make it simple for any citizen developer to cleanse their data, such as:

  • Fill & Clean: which imputes missing data with fixed or dynamic values
  • Data Manager: which spots and dismisses duplicate rows with a single click

And if these are not enough you can leverage various other features such as:

  • An advanced join capability to merge different datasets based, for example, on string similarity
  • Statistical or textual Data Manager functions, which deal with outliers or incorrect formats
  • …and much more!
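For readers who think in open-source idioms, the pandas sketch below performs comparable basic cleansing steps (imputation, de-duplication, outlier capping). It is an analogy to the tasks above, not their implementation.

  import pandas as pd

  df = pd.DataFrame({
      "customer": ["Acme", "Acme", "Bolt ", None],
      "amount": [100.0, 100.0, None, 25.0],
  })

  # Fill & Clean analogue: impute missing values with fixed or dynamic values.
  df["customer"] = df["customer"].fillna("UNKNOWN").str.strip()
  df["amount"] = df["amount"].fillna(df["amount"].median())

  # Data Manager analogue: dismiss duplicate rows.
  df = df.drop_duplicates()

  # Outlier handling: cap values outside the 1st-99th percentile range.
  low, high = df["amount"].quantile([0.01, 0.99])
  df["amount"] = df["amount"].clip(low, high)
  print(df)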

Although the above approaches have proven useful with many basic issues, there are some cases where a data value seems pretty normal and yet hides an inconsistency.

Unmask inconsistency with eXplainable AI

Among all these dimensions, consistency is one of the most difficult to deal with. A target attribute is considered “consistent” if it changes in accordance with its related attributes; i.e., its values change consistently when the context changes.

The table below illustrates an example of inconsistency (guess why!):

Name     Age   Married
John     28    Yes
Mike     32    No
Paul     5     Yes
Brenda   54    Yes

Also, sometimes you know that a subset of your data is inconsistent, but you don’t have the proper rules to correct it.

Or you have some basic rules, but there are so many exceptions that the final correct values can hardly be identified.

Rulex approaches all the above scenarios with a disruptive solution called Robotic Data Corrections (RDC), which seamlessly provides correction proposals to inconsistent data.

Behind the magic is a proprietary eXplainable AI algorithm called the Logic Learning Machine, capable of inferring the ruleset from which proposals are devised. With this approach, the user simply accepts or rejects recommendations according to their domain knowledge, and the algorithm integrates this new knowledge into successive iterations. After four to five iterations, accuracy is usually close to 99%.
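To make the interaction pattern concrete, here is a toy, runnable sketch of a propose-review-integrate loop. It is emphatically not the Logic Learning Machine: the “learner” below is a single hard-coded rule, and the data echoes the marriage example above; everything is invented for illustration.

def learn_rule(rejected):
    # Stand-in learner: "people under 18 are not married", except for
    # records where the expert has already rejected a correction.
    return lambda r: 'No' if r['Age'] < 18 and r['Name'] not in rejected else r['Married']

def robotic_corrections(records, reviewer, n_iterations=5):
    rejected = set()
    for _ in range(n_iterations):
        rule = learn_rule(rejected)
        for r in records:
            proposed = rule(r)
            if proposed != r['Married']:
                if reviewer(r, proposed):        # expert accepts: apply the correction
                    r['Married'] = proposed
                else:                            # expert rejects: remember the exception
                    rejected.add(r['Name'])
    return records

people = [{'Name': 'Paul', 'Age': 5, 'Married': 'Yes'},
          {'Name': 'John', 'Age': 28, 'Married': 'Yes'}]
print(robotic_corrections(people, reviewer=lambda r, p: True))  # Paul's value becomes 'No'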

In addition, RDC catches any new data quality issues associated with material “phasing in”: at a steady state, minimum effort is required to attain the highest levels of accuracy.

But as we mentioned, the realm of Data Quality is complex and the issue types are diverse: sometimes dependencies on driving attributes involve mathematical formulations; sometimes, even if you do have a settled ruleset, it is not easy to update; and sometimes the dependencies between rules are simply too complex to manage.

Luckily, the realm of the solutions provided by Rulex is also diverse.

Ignite your rules with the Rule Engine

Rulex provides a task which allows any citizen developer to write their rules with a simple syntax in an ordinary spreadsheet, import the rule file, and apply the rules to a dataset. This empowering task is called the “Rule Engine”.

The beauty of this approach is that any existing rules can be coded in the task: from the simplest rules to rules involving complex conditions, or output values resulting from complex mathematical or logical functions. Moreover, the whole process of ensuring data quality stays completely in the hands of the citizen developer, without needing to resort to skilled expertise to modify the rules or create new ones, which definitely shortens the time-to-value.
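To give a flavor of the syntax, a rule written in the spreadsheet might look like the following (the attribute names and thresholds are invented for illustration):

IF "amount" > 10000 AND "amount" < 50000 AND "project" in {'Small', 'Medium'} THEN "PaymentStatus" in {'VALID'}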

Finally, our Product Team is working on a solution for those unsure if all the rules are properly configured.

Sharpen your rules with the Rule Enhancer

The Rule Enhancer is an innovative task which refines existing rules: think of it as a tuning tool. It requires a data (sub)set containing clean, accurate values (the so-called “ground truth”), used to adjust the rules, plus some sort of performance criterion (such as the F1 score); as a result, fine-tuned parameters are provided for each rule. If you are interested, just hang in there a bit longer: the task will be released shortly!
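As a rough illustration of what this tuning means, the sketch below grid-searches a single numeric rule threshold against a ground-truth subset, keeping the candidate with the best F1 score. It is a schematic of the idea, not the Rule Enhancer’s actual interface; the data and thresholds are invented.

def f1_score(y_true, y_pred):
    # F1 = 2*TP / (2*TP + FP + FN)
    tp = sum(t and p for t, p in zip(y_true, y_pred))
    fp = sum((not t) and p for t, p in zip(y_true, y_pred))
    fn = sum(t and (not p) for t, p in zip(y_true, y_pred))
    return 2 * tp / (2 * tp + fp + fn) if tp else 0.0

def tune_threshold(values, ground_truth, candidates):
    # Keep the threshold whose rule "value > threshold" best matches the ground truth
    return max(candidates, key=lambda th: f1_score(ground_truth, [v > th for v in values]))

amounts = [8000, 12000, 30000, 60000]
is_valid = [False, True, True, False]   # clean, verified labels: the "ground truth"
print(tune_threshold(amounts, is_valid, candidates=[5000, 10000, 50000]))  # -> 10000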


Let your data bloom

Together, these multiple approaches form the basis of a 360-degree solution that reaches top accuracy levels and can be applied in a comprehensive Data Quality pipeline, so that any kind of issue can be tackled and solved. And what’s more: the implementation can be proficiently managed by any citizen developer who understands the underlying data well.
Rulex Platform provides all the solutions needed to make your knowledge blossom into colourful, accurate data.

Discover Rulex Platform’s data quality solutions

Exploring the AI ACT: Transparency as the Key for Future Technology https://www.rulex.ai/exploring-the-ai-act-transparency-as-the-key-for-future-technology/ Thu, 29 Feb 2024 09:00:22 +0000 https://www.rulex.ai/?p=241514

Nowadays, artificial intelligence (AI) systems are seamlessly integrated into our daily lives, making tailored suggestions and influencing our decisions. While artificial intelligence offers incredible advantages, it is necessary to address the potential issues of bias, discrimination, and privacy associated with its use.

In response to its pervasive influence in contemporary society, various initiatives have been pursued, notably those headed by the European Union (EU). The EU’s landmark Artificial Intelligence Act (AIA) represents a robust regulatory architecture designed to address the challenges posed by AI.

The Artificial Intelligence Act is the first law to harmonize, regulate, and restrict the use of artificial intelligence in Europe. It is expected to enter into force in 2024.

A risk-based approach

A crucial aspect of the EU AI Act is its risk-based methodology. The greater the risk associated with the use of a specific artificial intelligence system, the greater the responsibilities for those who use or provide it. This can extend to a prohibition on the use of systems deemed to have an unacceptable level of risk, thereby emphasizing individual rights and model transparency.

Indicatively, the classification includes the following risk levels:

  • Minimum/low risk: systems with minimal risk to people’s safety and fundamental rights should be subject to transparency obligations, ensuring a basic level of clarity and comprehension.
  • High-risk AI: systems whose application could have substantial implications, potentially leading to harm. Consequently, they are subject to stringent regulations aimed at mitigating bias and discrimination. Identifying risks and implementing corresponding mitigation strategies is imperative across the entire life cycle of these AI systems. Thus, ensuring transparency becomes essential for interpreting results and facilitating proper oversight of the decision-making process. In fact, the Artificial Intelligence Act stipulates that high-risk AI systems are subject to a number of requirements and obligations, such as the adoption of necessary technical documentation, transparency of information, adequate levels of cybersecurity, etc.
  • Unacceptable risk: any AI system which is considered a direct threat to fundamental human rights, and is consequently prohibited.

Furthermore, guidelines and standards have been implemented both for basic AI systems, necessitating clear disclosure when individuals interact with them, and for general-purpose AI systems (GPAI), whose capability to operate across market sectors carries the risk of a systemic negative effect on society as a whole.

The importance of transparency

Meeting the stringent transparency requirements of the new AI Act could be extremely challenging – if not impossible – with traditional AI technologies.

For example, one of the crucial applications of AI is in the realm of credit rating systems, where it empowers banks to examine vast sets of customer data for accurate evaluations of creditworthiness. Considering that these systems provide a perspective on an individual’s financial standing by examining not only financial indicators, but also spending habits and behavioral patterns from diverse sources, ensuring fairness in the process is of paramount importance.

Explainable AI (XAI) is a facet of artificial intelligence that produces clear results and provides the rationale for its predictions and subsequent decisions, consequently enhancing accountability and acting as a safeguard against the influence of bias and discrimination.

Rulex’s XAI vision

Rulex’s journey began in the 1990s, fueled by a singular mission: to make AI explainable while maintaining its accuracy and speed. Ever since, its ground-breaking eXplainable AI has remained focused on addressing these very challenges within the data management process.

Central to this achievement is the Logic Learning Machine (LLM), an algorithm developed by Rulex’s founder. The innovation lies in its ability to articulate explicit and straightforward rules, presented in a logical if-then structure. This approach mirrors the cognitive processes of the human brain, ensuring a transparent and traceable workflow.

This commitment not only ensures compliance with GDPR and other privacy regulations but also lays a solid groundwork for the impending implementation of the AI Act.

Benefits of XAI

  • Trust: Establishing transparency in decision-making is essential to cultivate a trusting relationship with all stakeholders. Business experts can effectively grasp and articulate the decision-making process, utilizing eXplainable AI systems to reassure the involved parties.
  • Compliance: XAI can assist companies in identifying and utilizing only the strictly necessary and crucial information from extensive datasets, thereby reducing certain risks associated with their management. In this way, actions are taken in compliance with regulations and in respect of individuals’ privacy.
  • Responsibility: The transparency and traceability of XAI ensure decisions are made without relying on discriminatory influence, thereby imposing a greater sense of accountability and responsibility on users.

A transparent credit rating solution

Over the years, Rulex has applied eXplainable AI principles within the financial services sector, developing numerous solutions ranging from fraud detection to NPL management and churn prevention.

Among these, Rulex’s credit rating solution serves as a prime example of the improved comprehension that our native XAI can offer regarding the underlying process logic.

This solution integrates a decision-making workflow that comprehensively covers every stage of the product lifecycle, from automated score calculation to rating assignment and continuous performance monitoring.

Rulex’s XAI algorithm generates intuitive if-then rules identifying the distinctive features of each rating class, enabling the classification of new cases.
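For example, one such rule might read as follows (a purely hypothetical rule, with invented attribute names and thresholds):

IF "debt_to_income" <= 0.35 AND "missed_payments" = 0 THEN "rating" in {'A'}

Each rule of this kind can be read, checked, and challenged by a credit expert, which is precisely what makes the model’s output explainable.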

These clear predictions allow experts to confidently make well-informed decisions and effectively communicate them to clients, all while mitigating bias and promoting fairness.

*At the time of writing this article, the final text of the Artificial Intelligence Act is still awaiting approval.

Discover more about Rulex for financial services

Rule-based Validation: 3 Reasons Why Rulex Does It Better https://www.rulex.ai/rule-based-validation-3-reasons-why-rulex-does-it-better/ Wed, 28 Feb 2024 08:00:34 +0000 https://www.rulex.ai/?p=241265

On September 23, 1999, at 09:00:46 UT, NASA’s Mars Climate Orbiter spacecraft lost contact with Earth as it passed behind Mars. The anticipated reconnection, 27 minutes later, never occurred: by that time, the spacecraft had crashed onto the red planet. Subsequent investigations revealed that the incident was caused by commands not being converted from English units to the metric standard.1

On April 6, 2018, a Samsung Securities worker inadvertently entered “shares” instead of the Korean currency “won” due to a keyboard error. This led to the accidental distribution of “ghost” shares worth over 100 billion dollars, ultimately causing a significant decline in Samsung Securities stock, not to mention a loss of credibility.2

What ties these incidents together? Data quality.

Data quality matters

Ensuring data quality involves tasks such as checking if values are within range or have the correct format, and it has been the center of many discussions since the early 1990s.3

Data quality issues may originate in the realm of data, but are certainly not confined to it, significantly impacting business efficiency, incurring higher costs and even jeopardizing the success of projects.4

To tackle the intricacies of data quality problems, organizations of all kinds are constantly looking for effective solutions that can combine both industry expertise and data knowledge.

Will a spreadsheet cut it?

While spreadsheets may suffice for small datasets with simple rules, they prove inadequate as data volume and rule complexity increase. Suppose you have only one or two data sources that you can merge into a small, unified dataset. If data quality can be assured with simple rules, such as verifying that payment amounts fall within an expected range, a basic spreadsheet formula might suffice.
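For instance, assuming the payment amount sits in cell A2 and the expected range is 10,000–50,000 (a hypothetical layout), a cell formula along these lines would do the job:

=IF(AND(A2>=10000, A2<=50000), "VALID", "INVALID")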

However, as business requirements grow more intricate, data volume expands, or the need arises to integrate new sources, spreadsheets really start to feel the strain, along with the people trying to use them. Similar to training wheels for a novice rider in a park, a spreadsheet is of little use to an experienced rider navigating a steep downhill track.


Find the expert, spell it out, iterate

So with increased complexity, you’ll need a data quality tool that can handle it. Unless you have a very technical background, you’ll also have to get an expert on board who can implement your rules, such as a Python programmer.

The sort of script your programmer could produce to perform a simple validation check, such as ensuring an amount lies within the 10,000 to 50,000 range, applicable only to projects categorized as “small” or “medium” in size, would look something like this:

import pandas as pd

# Sample payment data
data = {
    'amount': [12000, 30000, 60000, 15000],
    'project': ['Small', 'Medium', 'Large', 'Small']
}
payments = pd.DataFrame(data)

# Mark everything INVALID by default, then validate rows that pass both checks
payments['PaymentStatus'] = 'INVALID'
mask = (payments['amount'].between(10000, 50000)) & (payments['project'].isin(['Small', 'Medium']))
payments.loc[mask, 'PaymentStatus'] = 'VALID'
print(payments[['PaymentStatus']])

Bringing in an expert to implement the solution is certainly a viable approach, as it allows you to handle volume and complexity. However, it has some important drawbacks:

  • As the execution of any implementation is not within your control, adapting to changes in requirements can be a bit of a journey, involving scheduling meetings to coordinate with programmers and/or tool specialists.
  • Despite investing time in clarifying these changes, there’s always a chance that not every detail will be fully grasped or smoothly executed.
  • And when it comes to integrating new data sources and ensuring they seamlessly align with existing datasets, things can get even more intricate. This can lead to a quick escalation in the effort required, calling for a diverse set of skills to merge and harmonize everything effectively.

The perfect solution would be a tool that can handle high data volumes and varying rule complexities while remaining accessible to a citizen developer.

Meet the Rule Engine

At Rulex, we address data validation challenges with a task called the “Rule Engine“.

This specially designed tool allows users to write business rules in a simple Excel file using an intuitive syntax. The rules can be applied to datasets, and the outputs can be exported to various formats, such as a database, a local file, or via API to an Advanced Planning System.

To assess the validity of our payment data with the Rule Engine, instead of writing a script, it’s sufficient to write a straightforward rule like the following:

IF "amount" > 10000 AND "amount" < 50000 AND "project" in {'Small', 'Medium'} THEN "PaymentStatus" in {'VALID'}

As these rules are written in an external spreadsheet, business users can independently add and modify them, without delving into the intricacies of the workflow, or even needing to know how the software works.

Managing business rules becomes seamless. If the complexity grows, it can be easily addressed thanks to the Rule Engine’s support for formulas within the rule syntax, prioritization of rules (executing fundamental rules first), and the ability to manage rule dependencies.
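For instance, a fundamental check can populate a status field that a later, dependent rule then builds on (the attribute names, and this way of expressing the dependency, are purely illustrative; rule priorities themselves are configured in the task):

IF "amount" > 0 THEN "AmountCheck" in {'OK'}
IF "AmountCheck" in {'OK'} AND "project" in {'Small', 'Medium'} THEN "PaymentStatus" in {'VALID'}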

And if new data sources come into play, they can be imported and merged into the existing flow through a user-friendly drag-and-drop interface.

3 main benefits of the Rule Engine:

  1. SIMPLE: You won’t need to onboard programmers to write complex scripts.
  2. FAST: You can independently modify and test rules and check results in minutes.
  3. FLEXIBLE: You can quickly add new data sources, prioritize rules, and change output, adapting easily to changing needs.

Whether averting a space exploration mishap or simply ensuring your business is not losing money, data quality is crucial. The Rule Engine is designed to give citizen developers complete control over the rule management process, enhancing efficiency and contributing to the vigilant maintenance of optimal data quality.

Now is the right time to cast aside those training wheels and confidently navigate your own path along the data trail!

Explainable AI in Life Sciences https://www.rulex.ai/explainable-ai-in-life-sciences/ Tue, 20 Feb 2024 08:00:45 +0000 https://www.rulex.ai/?p=240790

While AI offers significant potential in life sciences, its implementation comes with several challenges, ranging from the pure size of medical databases, to mandatory regulatory compliance and the ethics of using black-box models in medical decision making.

Rulex Platform’s eXplainable AI has a profound impact on the implementation of AI in this sensitive sector by producing transparent, human-understandable results. This transparency enables medical experts to understand and explain any predictions made, while guaranteeing ethical data models and results, and adherence to privacy regulations. Such straightforward interpretability is essential for gaining trust and understanding the rationale behind medical decisions, and enables a healthy balance in human-AI collaboration.

Rulex Platform can also easily gather, aggregate and analyse extremely large datasets in any format, and from any source, while integrating with underlying information systems, such as electronic health records, or laboratory information management systems, without causing disruption and upheaval. Results can also be produced in any format required, whether that is an e-mail with urgent results, a tailored spreadsheet saved on a common server, or an interactive dashboard to show colleagues.

For its inherent explainability and agility in data management, Rulex Platform has been chosen by healthcare and life sciences organizations to leverage medical records, resulting in improved health outcomes, enhanced clinical and operational decision-making, and pioneering research.

1. Improving Data Quality in Hospital Discharge Reports

Health check systems in Italy are overseen by regional and local health authorities, who actively monitor and regulate the quality of healthcare services to ensure their appropriateness. Over time, numerous Italian regions have developed and revised guidelines and operational procedures aimed at scrutinizing hospital discharge reports and medical records.

The significance of accuracy in medical records cannot be overstated, as errors can lead to various repercussions, ranging from minor billing discrepancies to critical issues such as incomplete or incorrect diagnoses, or delays in scheduling surgical interventions.

In collaboration with Deimos, Rulex leveraged their eXplainable Artificial Intelligence (XAI) technologies to automate the scrutiny of coding in hospital discharge forms within the Alto Adige health authority. The primary focus of the study was to assess the feasibility of applying automatic checks, characterized as logical clinical checks, not only to ensure compatibility between sex-diagnosis or age-diagnosis, as traditionally done with formal logical checks, but also to explore the intricate relationships between clinical variables in hospital discharge reports. This approach aimed to automatically identify inconsistencies among diagnosis, surgery, medical procedures, and Diagnosis-Related Groups (DRGs).

The tested methodologies yielded promising results. Validation rules were defined, improving the efficiency of automatic record checks and the identification of the probable locations of errors; the personnel time required for record checking was significantly reduced; and automatic checks were carried out on all surgical hospital discharge records, not just a test subset. Overall, the innovative approach not only enhanced the precision of existing checks but also introduced a more comprehensive and nuanced evaluation of the relationships within medical records.

Related research paper (in Italian):

2. Tailoring Diagnostic Predictions for Primary Biliary Cholangitis

Precision medicine seeks to customize the diagnosis, monitoring, and management of individuals based on their unique genetic and environmental backgrounds. This undertaking is particularly challenging due to the intricate nature of medical traits and the presence of multiple variants. The complexity is further amplified when addressing rare diseases, where limited historical data poses an additional hurdle.

In collaboration with the medical departments of Milano-Bicocca and Humanitas universities, Rulex conducted a pioneering study to assess the feasibility and precision of predicting the risk of Primary Biliary Cholangitis (PBC) using eXplainable AI (XAI). The focus was on identifying novel patient subgroups, disease sub-phenotyping, and risk stratification.

The XAI algorithm was applied to an extensive international dataset of PBC patients, divided into a training set of 11,819 subjects and a validation set of 1,069 subjects, with a meticulous analysis of key clinical features. The primary outcome was a composite of liver-related death or liver transplantation, assessed through a combination of machine learning and standard survival analysis.

The analysis revealed four distinct patient clusters, each characterized by unique phenotypes and long-term prognoses. These findings represented a pivotal milestone in formulating a targeted treatment approach for PBC. Additionally, they laid the foundation for ongoing efforts in identifying and providing timely treatment for the relatives of patients, confirming the potential of XAI in advancing precision medicine for complex diseases.

Related research paper:

  • Alessio Gerussi, Damiano Verda, Davide Paolo Bernasconi, Marco Carbone, Atsumasa Komori, Masanori Abe, Mie Inao, Tadashi Namisaki, Satoshi Mochida, Hitoshi Yoshiji, Gideon Hirschfield, Keith Lindor, Albert Pares, Christophe Corpechot, Nora Cazzagon, Annarosa Floreani, Marco Marzioni, Domenico Alvaro, Umberto Vespasiani-Gentilucci, Laura Cristoferi, Maria Grazia Valsecchi, Marco Muselli, Bettina E. Hansen, Atsushi Tanaka, Pietro Invernizzi, Machine learning in primary biliary cholangitis: A novel approach for risk stratification, Wiley, Dec 2021.

3. Identifying Correlations with XAI to Improve Metabolic Control in Type 2 Diabetes

One of the primary goals of diabetologists is to establish effective metabolic control in type 2 diabetes patients, measured through hematic levels of HbA1c, without causing weight gain.

The Italian diabetology association used Rulex’s proprietary XAI to extract and rank the factors most strongly associated with reduced HbA1c levels. The study involved vast amounts of raw data, including the medical records of 2 million diabetic patients and the data collected from medical visits over a 10-year period, with over 137 variables per patient.

Significant correlations were identified, such as the use of specific receptor agonists, and it was established that HbA1c and weight gain have different determinants. These results led to more effective care for diabetic patients.

Related research paper:

4. Extracting Rules to Diagnose Pleural Mesothelioma

Malignant pleural mesothelioma (MPM) is a rare and highly lethal tumor, with its incidence rising rapidly in developed countries due to past asbestos exposure in various environments. Accurate diagnosis of MPM faces challenges, as atypical clinical symptoms often lead to potential misdiagnoses with other malignancies (especially adenocarcinomas) or benign inflammatory or infectious diseases (BD) causing pleurisies. While cytological examination (CE) can identify malignant cells, a notable false negative rate may occur due to the prevalence of non-neoplastic cells. Additionally, a positive CE result alone may not distinguish MPM from other malignancies.

Various tumor markers (TM) have proven to be valuable complementary tools for MPM diagnosis. Recent studies focused on three tumor markers in pleural effusions: soluble mesothelin-related peptide (SMRP), CYFRA 21-1, and CEA. Their concentrations were analyzed in association with the differential diagnosis of MPM, pleural metastasis from other tumors (MTX), and BD. SMRP demonstrated the best performance in distinguishing MPM from both MTX and BD, while high CYFRA 21-1 values were linked to both MPM and MTX. Conversely, elevated CEA concentrations were primarily observed in patients with MTX. Combining information from the three markers and CE could form a classifier to separate MPM from both MTX and BD.

In this context, the Rulex Logic Learning Machine (LLM) was employed for the differential diagnosis of MPM by identifying straightforward and understandable rules based on CE and TM concentrations. Comparative analyses with other supervised methods, including Decision Trees, K-Nearest Neighbors, and Artificial Neural Networks, revealed that LLM consistently outperformed all competing approaches.

Related research paper:

5. Extracting a Simplified Gene Expression Signature for Neuroblastoma Prognosis

The outcome of cancer patients is, in part, influenced by the gene expression profile of the tumor. In a prior study, a 62-probe set signature (NB-hypo) was identified for detecting tissue hypoxia in neuroblastoma. This signature effectively stratified neuroblastoma patients into good and poor outcome groups. Establishing a prognostic classifier was crucial for grouping patients into risk categories, aiding in the selection of tailored therapeutic approaches.

To enhance the accuracy of predictors and create robust tools for clinical decision support, novel classification and data discretization approaches were explored. In this study, Rulex was employed on gene expression data, specifically using the Attribute Driven Incremental Discretization technique to transform continuous variables into simplified discrete ones. This pre-processing step facilitated rule extraction through the Logic Learning Machine (LLM). The application of LLM yielded 9 rules, primarily based on the relative expression of 11 probe sets. These rules proved highly effective as predictors and were validated independently, confirming the efficacy of the LLM algorithm on microarray data and patient classification.
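As a rough illustration of discretization in general (not the Attribute Driven Incremental Discretization technique itself), a continuous expression value can be binned into a few ordered levels; the cut points below are invented:

import pandas as pd

expression = pd.Series([0.2, 1.7, 3.5, 9.8])
# Bin a continuous variable into three ordered levels
levels = pd.cut(expression, bins=[0, 1, 5, 10], labels=['low', 'medium', 'high'])
print(levels.tolist())   # ['low', 'medium', 'medium', 'high']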

The LLM demonstrated efficiency comparable to Prediction Analysis of Microarray and Support Vector Machine, surpassing other learning algorithms like C4.5. Rulex conducted feature selection, resulting in a new signature (NB-hypo-II) comprising 11 probe sets, identified as the most relevant in predicting outcomes. This comprehensive approach underscores the potential of utilizing LLM in the development of reliable prognostic classifiers for cancer patients.

Related research paper:

6. Extracting Intelligible Rules in Neuroblastoma Prognosis

Neuroblastoma, the most common pediatric solid tumor, poses a significant challenge as approximately fifty percent of high-risk patients do not survive treatment. The urgent need for improved stratification strategies led to the exploration of new, more effective approaches. Hypoxia, characterized by low oxygen tension in poorly vascularized tumor areas, is associated with a poor prognosis. This study aimed to develop a prognostic classifier for neuroblastoma patients by integrating existing knowledge of clinical and molecular risk factors with the NB-hypo signature.

The focus was on creating classifiers that produce explicit rules easily applicable in a clinical setting. The Logic Learning Machine, known for its accuracy, seemed promising for achieving the study’s objectives. The algorithm was employed to classify neuroblastoma patients based on key risk factors: age at diagnosis, INSS stage, MYCN amplification, and NB-hypo. The algorithm successfully generated clear classification rules that aligned well with established clinical knowledge.

To enhance stability, an iterative process identified and removed examples causing instability in the rules from the dataset. This refined workflow resulted in a stable classifier highly accurate in predicting outcomes for both good and poor prognosis patients. The classifier’s performance was further validated in an independent dataset. Notably, NB-hypo emerged as a crucial component of the rules, demonstrating a strength comparable to tumor staging. This comprehensive approach showcases the potential of the Logic Learning Machine in developing a robust prognostic classifier for neuroblastoma patients.

Related research paper:

7. Validating a New Classification for Multiple Osteochondromas Patients​

Multiple osteochondromas (MO), formerly recognized as hereditary multiple exostoses (HME), is an autosomal dominant disorder marked by the development of benign cartilage-capped bone growths known as osteochondromas or exostoses. Despite various clinical classifications proposed, a consensus remains elusive. This study aimed to validate an “easy-to-use” tool, employing a machine learning approach, to categorize MO patients into three classes based on the number of affected bone segments, the presence of skeletal deformities, and/or functional limitations.

The proposed classification, assessed through the Switching Neural Network underlying the Logic Learning Machine technique, demonstrated a highly satisfactory mean accuracy. A comprehensive analysis of 150 variables across 289 MO patients facilitated the identification of ankle valgism, Madelung deformity, and limitations in hip extra-rotation as distinctive features (“tags”) of the three clinical classes. In summary, the proposed classification offers an effective system for characterizing this rare disease, enabling the definition of homogeneous patient cohorts for in-depth investigations into MO pathogenesis.

Related research paper:

8. Predicting Obstructive Sleep Apnea in People with Down Syndrome

Obstructive sleep apnea (OSA) is notably prevalent in individuals with Down Syndrome (DS), with reported rates ranging from 55% to 97%, a stark contrast to the 1–4% prevalence in the neurotypical pediatric population. However, conventional sleep studies are often uncomfortable, expensive, and poorly tolerated by those with DS.

To address this, a dataset encompassing over 460 observations was compiled for 102 Down syndrome patients. Each patient underwent a polysomnogram, and the dataset included diverse information such as clinical visit findings, parent surveys, wristband oximeter data, urine proteomic analysis, lateral cephalogram results, and 3D digital photos.

Utilizing the Logic Learning Machine (LLM), a predictive model was developed to ascertain the occurrence of obstructive sleep apnea in individuals with Down syndrome. This approach aimed to offer an alternative to uncomfortable and costly tests like polysomnograms.

The LLM classification task successfully identified a predictive model represented by a set of simple rules, exhibiting a high negative predictive value of 81.5%. Additionally, the Feature Ranking task allowed for the identification of the most relevant variables, assigning a quantitative score to their importance in the predictive model. This innovative methodology not only facilitates a more comfortable diagnosis for individuals with DS but also provides a streamlined and effective means of identifying obstructive sleep apnea.
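For reference, the negative predictive value is the share of predicted negatives that are truly negative: NPV = TN / (TN + FN), where TN and FN are the counts of true and false negatives; here, 81.5% of the cases the model ruled out were indeed free of OSA.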

Related research paper:

9. Benchmarking LLM Performance on Standard Biomedical Datasets

In this study, we employed Rulex’s Logic Learning Machine on three benchmark datasets related to distinct biomedical issues. These datasets were sourced from the UCI archive, a repository of data used for machine learning benchmarking. The datasets are as follows:

  1. Diabetes:
    • Objective: Diagnosing diabetes based on the values of 8 variables.
    • Patient Characteristics: All 768 patients considered are females, at least 21 years old, and of Pima Indian heritage.
    • Cases and Controls: Out of the 768 patients, 268 are effective cases of diabetes, while the remaining 500 are controls.
  2. Heart disease:
    • Objective: Detecting heart disease using a set of 13 input variables related to patient status.
    • Sample Size: The total sample comprises 270 elements, with 120 cases of effective heart disease and 150 controls.
  3. Donor/acceptor DNA:
    • Objective: Recognizing acceptors and donors’ sites in primate gene sequences with a length of 60 (basis).
    • Dataset Composition: The dataset consists of 3186 sequences categorized into three classes: acceptor, donor, and none.

The performance of the Rulex Logic Learning Machine (LLM) was compared to other supervised methods, including Decision Trees (DT), Artificial Neural Networks (ANN), Logistic Regression (LR), and K-Nearest Neighbor (KNN). The conducted tests revealed that the results obtained by LLM surpassed those of ANN, DT (which generates rules), and KNN. Moreover, LLM’s performance was found to be comparable to that of LR.

Dataset   Records   Inputs   Classes   LLM Accuracy   LLM Rules   DT Accuracy   DT Rules   ANN Accuracy   LR Accuracy   KNN Accuracy
Diabetes  768       8        2         76.52%         16          76.09%        42         75.65%         76.52%        68.70%
Heart     270       13       2         75.31%         19          64.20%        17         72.84%         74.07%        51.85%
DNA       3186      60       3         91.98%         19          90.04%        67         87.09%         92.57%        40.38%

Related research paper:

Discover more about Rulex for life sciences & healthcare
