NIST unveils new open source platform for AI safety assessments

Mike Miliard August 7, 2024

The US National Institute of Standards and Technology (NIST) has released an open-source tool called Dioptra for testing the safety and security of AI and machine learning models. Dioptra helps users identify attacks and vulnerabilities that might degrade a model's performance, and quantify how much that performance drops. NIST says the tool can help organisations understand their AI software and assess its resilience to adversarial attacks.

Source: www.healthcareitnews.com