
AIxCC SEMIFINAL COMPETITION (ASC)

We’ve restructured the competition schedule to enable wide participation and ensure teams have the time to develop the best systems possible. The AIxCC Semifinal Competition (ASC) in August 2024 at DEF CON will be open to all registered teams.

The Qualification Competition and the Semifinal Competition have been combined into one event. This means that teams will have from our kickoff in March through July to develop the best possible Cyber Reasoning System (CRS) for finding and fixing software vulnerabilities at scale. For those five months, all registered teams will have access to technology from our AI collaborators to develop their CRS. Leading up to Semifinals, we’ll provide you with access to a competition environment to test out your CRS on exemplar challenges.

Teams can compete in the ASC either virtually or in person. During the ASC, competitors will receive an identical corpus of Challenge Projects (CPs) modeled on real-world, open-source projects; the CPs will contain vulnerabilities that must be identified and secured. The goal of the ASC is to create a fully autonomous CRS that finds and fixes vulnerabilities within those CPs without human assistance.

Scoring Algorithm & Exemplar Challenge Preview

During gameplay, each CRS will be given a suite of software projects, or Challenge Projects (CPs), and asked to automatically secure them. To ensure competitors develop a CRS that can integrate successfully into real-world software security work, we’ve developed a scoring system based on four key metrics to fairly assess each competitor’s system (an illustrative sketch of how these might combine follows the list):

  • Diversity Multiplier
  • Accuracy Multiplier
  • Vulnerability Discovery Score
  • Program Repair Score
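
The official composition of these metrics is defined in the Scoring Algorithm release, not here. As a purely illustrative sketch, the snippet below shows one plausible way multipliers and scores could combine: per-CP discovery and repair points scaled by an accuracy multiplier, summed across CPs, then scaled by a CRS-wide diversity multiplier. All names, fields, and the formula itself are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class CpResult:
    """Hypothetical per-Challenge-Project result.

    Field names are illustrative, not taken from the official
    AIxCC Scoring Algorithm.
    """
    vuln_discovery_score: float  # points for vulnerabilities found
    program_repair_score: float  # points for patches that fix them
    accuracy_multiplier: float   # e.g., could penalize false positives

def semifinal_score(results: list[CpResult], diversity_multiplier: float) -> float:
    """One plausible composition (an assumption, not the real formula):
    sum accuracy-scaled per-CP scores, then apply a CRS-wide
    diversity multiplier."""
    per_cp_total = sum(
        (r.vuln_discovery_score + r.program_repair_score) * r.accuracy_multiplier
        for r in results
    )
    return per_cp_total * diversity_multiplier

if __name__ == "__main__":
    results = [
        CpResult(vuln_discovery_score=2.0, program_repair_score=1.0, accuracy_multiplier=0.8),
        CpResult(vuln_discovery_score=1.0, program_repair_score=0.0, accuracy_multiplier=1.0),
    ]
    print(semifinal_score(results, diversity_multiplier=1.5))  # -> 5.1
```

However the final formula is specified, the intuition is the same: discovery and repair earn points, while the multipliers reward a CRS that performs accurately across a diverse set of CPs rather than excelling on just one.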
 
The AIxCC CPs are based on real, open-source software. We’ll be working closely with the Open Source Security Foundation to identify the most critical open-source software and design challenge sets inspired by real-world CVEs. For the AIxCC Semifinal Competition, CPs will be written primarily in C, C++, and Java, but may span a range of languages. We encourage you to think creatively about how a CRS can leverage AI technology to handle this diversity.

Collaborator Resources

To help competitors build their CRSs, DARPA’s collaborators at Anthropic, Google, Microsoft, and OpenAI will make credits for their large language models and computing resources available to those who meet eligibility and application requirements. We will share more specifics in 2024!

Request for Comments

Thank you for sharing your feedback with AIxCC. We are no longer accepting comments as of Jan. 15, 2024, 23:59 Anywhere on Earth (AoE, UTC−12:00).

Establishing a fair and effective scoring system is critical to AIxCC’s success, and we look forward to incorporating your feedback in the updated Scoring Algorithm and Exemplar Challenge releases.

Download the Request for Comments (RFC).