Unpredictable and subjective behavior of machines is a good ingredient for a thriller, food for thought, but a bad ingredient for our peace of mind.
AI is potentially destructive, practically incomprehensible and mostly unintelligible.
Therefore, it is important to remove any mysticism surrounding ‘AI solutions’. This book explains that unmotivated decisions are not intelligent and, more importantly, not needed.
We need explainable AI. XAI.
Generating explanations to motivate decisions provides many business benefits. Get better results out of your AI investment by being aware of the pitfalls, following the guidelines, and understanding the cases of XAI.
Explainable AI explained. XAIX.
There has never been a good reason for black box IT and there are ways to open black box AI solutions. Close the feedback loop between strategy and operations.
Silvie Spreeuwenberg has a master's degree in AI and holds an MBA. She graduated on generating explanations for a machine-learned model. Clients ask her to optimize operational decisions and align them with an organization's strategy or policy. She has worked in almost every vertical as a consultant, trainer and entrepreneur.
As an expert in decision support systems development, I have been promoting transparency and self-explanatory systems to close the plan-do-check-act cycle.
All too often, modern systems have similar issues, and face the same fate, as the legacy systems they replaced, because the domain experts or end-users are not involved in the feedback loop.
My impression is that the journey is starting all over again as organizations start using AI technology as black box systems.
This is not necessary; read this book to find out why.