When software decides who gets a loan, a job interview, or medical attention, performance isn't enough; principle matters. This book is a practical blueprint for building systems that are useful, lawful, and worthy of trust. Moving beyond slogans about ethical AI, it shows product teams how to translate values into requirements, harms into tests, and judgement into accountable workflows. You'll learn how to diagnose algorithmic bias at its source, design for fairness in AI without tanking utility, and implement explainable decisions that actually change outcomes, not just narratives.

Inside, you'll find a compact AI ethics framework that fuses philosophy with engineering: value specifications, harm tables, appeal flows, and audit trails you can put to work on day one. Clear guidance on AI governance turns compliance from a drag into an advantage, while patterns for human-in-the-loop oversight ensure people are empowered to override, not scapegoated for mistakes. Real-world case studies, from lending to content ranking, illustrate how accountable algorithms balance accuracy with dignity, and where responsible machine learning requires the most restraint: knowing when not to predict.

Written for builders and leaders (engineers, data scientists, PMs, policy leads, founders), this is a field guide to data ethics you can ship. If you've ever been asked to "add fairness later," this book gives you the language, tools, and decision paths to build it in from the start.
Ethics Engine
SKU: 9789374126653
$31.99 Regular Price
$23.01 Sale Price
Levent Karaman writes at the seam where code meets conscience. Raised between workshop pragmatism and library philosophy, he builds systems that work in the world without forgetting the people in it. His work asks unfashionable questions (who is left out of the dataset, what duty outlives a deadline, where does a model's confidence exceed its right to decide) and answers them with artefacts you can ship: value specifications, harm tables, audit trails. He draws on a European inheritance of public reasoning, from Spinoza's realism about power to Mary Midgley's distrust of single big ideas, and pairs it with the builder's instinct to make ethics testable. Off the page he mentors product teams and community reviewers, believing good governance is a civic craft, not a compliance ritual. He writes to give practitioners language, tools, and courage enough to choose the slower good.