
    NIST Secure Software Development Framework for Generative AI and for Dual Use Foundation Models Virtual Workshop

    NIST is hosting a workshop on Wednesday, January 17, 2024, from 9:00 AM – 1:00 PM EST to bring together industry, academia, and government to discuss secure software development practices for AI models. Attendees will gain insight into major cybersecurity challenges specific to developing and using AI models, as well as recommended practices for addressing those challenges. Feedback from various communities will inform NIST’s creation of SSDF companion resources to support both AI model producers and the organizations that adopt and incorporate those AI models into their own software and services.

    Background

    Executive Order 14110 of October 2023, Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence, tasked NIST with “developing a companion resource to the SSDF to incorporate secure development practices for generative AI and for dual-use foundation models.” NIST’s SSDF version 1.1 describes a set of fundamental, sound practices for general secure software development. The SSDF focuses on outcomes rather than specific tools and techniques, so it can be applied to any type of software development, including AI models.

    To provide software producers and acquirers with more information on secure development for AI models, NIST is considering the development of one or more SSDF companion resources on generative AI models and dual-use foundation models. These companion resources would be similar in concept and content to the Profiles for the NIST Cybersecurity Framework, Privacy Framework, and AI Risk Management Framework.

    During the workshop, NIST is seeking feedback on several topics to help inform the development of future SSDF Profiles, including:

    1. What changes, if any, need to be made to SSDF version 1.1 to accommodate secure development practices for generative AI and dual-use foundation models?
    2. What AI-specific considerations should NIST capture in its companion resource?
    3. What else should be captured in the SSDF Profiles?
    4. Is there an alternative to an SSDF Profile that would be more effective at accomplishing the EO 14110 requirement, while also providing flexibility and technology neutrality for software producers?
    5. What secure development resources specific to AI models do you find most valuable?
    6. What is unique about developing code for generative AI and dual-use foundation models?

    Questions about the workshop or NIST’s SSDF work? Contact us at ssdf@nist.gov.


    https://www.nist.gov/news-events/events/nist-secure-software-development-framework-generative-ai-and-dual-use-foundation
