Facebook releases Captum, a tool it says will help explain the decisions made by its machine learning models

In an effort to offer insight into how neural networks perform complicated tasks -- be it identifying objects in images or building forecasts from financial records -- Facebook has open-sourced a tool that surfaces what a model is doing internally as it processes an input.

The tool, called Captum, is the latest in a series of recent pushes by AI developers to offer transparency into the oft-described "black box" of machine learning.

Captum works with PyTorch, Facebook's machine learning framework.
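
To give a sense of what that looks like in practice, here is a minimal sketch using Integrated Gradients, one of the attribution methods Captum ships with. The toy model, input, and target class below are hypothetical placeholders, not anything from Facebook's announcement.

```python
import torch
import torch.nn as nn
from captum.attr import IntegratedGradients

# Hypothetical toy classifier standing in for any PyTorch model.
model = nn.Sequential(
    nn.Linear(4, 8),
    nn.ReLU(),
    nn.Linear(8, 3),
)
model.eval()

# A single placeholder input with four features.
inputs = torch.rand(1, 4)

# Integrated Gradients attributes the prediction for a chosen class
# (target=0 here, chosen arbitrarily) back to each input feature.
ig = IntegratedGradients(model)
attributions, delta = ig.attribute(
    inputs,
    target=0,
    return_convergence_delta=True,
)

print(attributions)  # per-feature contribution scores
print(delta)         # approximation error of the attribution
```

The attribution scores indicate how much each input feature pushed the model toward the chosen output, which is the kind of intermediate insight the library is meant to expose.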

While privacy and technology observers have certainly called for greater transparency in AI, Facebook has positioned Captum as a tool for developers to improve their own systems.

From Facebook's blog post:

Model interpretability libraries such as Captum help engineers create more reliable, predictable, and better-performing AI systems. They can inform decision-making about how those systems are used and build trust with others. In addition, as the number of multimodal models increases, the ability for interpretability libraries and visualizations to work seamlessly across such modalities will be crucial.
