
AIxCC Semifinal Competition (ASC)

As threat actors continue to exploit software vulnerabilities with increasingly sophisticated attacks, the urgency around automated and scalable vulnerability discovery and remediation has also escalated. To address this pressing need, the Artificial Intelligence Cyber Challenge (AIxCC) incentivizes the best and brightest engineers and program analysts to develop real-world solutions to these cybersecurity threats.

Competition. Leading up to the ASC, competitors aimed to develop Cyber Reasoning Systems (CRSs): tools designed to automatically find and fix vulnerabilities at scale using Large Language Models. AIxCC received nearly 40 CRS submissions and tested each against an identical corpus of Challenge Projects, every one derived from a real-world, open-source project: Jenkins, the Linux kernel, Nginx, SQLite3, and Apache Tika.

Experience. In parallel, the AIxCC organizers constructed a futuristic, fictional city that brought the AIxCC Semifinal Competition to life—the City of Northbridge. The City of Northbridge made its debut at the AIxCC Village at DEF CON 32, in Las Vegas, Nevada. DEF CON attendees, ranging from software developers to policymakers, boarded the train to Northbridge as their host, Kiti, called upon the Cyber Collective to remain vigilant amidst cyberattacks. For three days, Kiti guided attendees through simulated cyberattacks on Northbridge’s critical infrastructure, enlisting participants’ help with manual data collection as cybercriminal “the Rat” targeted hospitals and water systems. Attendees engaged with representatives from the AIxCC organizers and collaborators, while Kiti revealed the vulnerability discovery and patching capabilities of the Semifinal teams’ CRSs. The urgency and relevance of competitor CRSs became clearer as these systems defended the city’s infrastructure against a dynamic threat landscape.

Click on the buttons below to learn more about the ASC.

AIxCC SEMIFINAL COMPETITION (ASC)

After examining public feedback collected over the past few months, the AIxCC Organizers have made updates to the AIxCC Semifinal Competition (ASC). These revisions fulfill multiple objectives, all converging toward one goal: facilitating a fair, successful, and impactful ASC. The ASC Procedures and Scoring Guide below summarizes these changes and will serve as the authoritative source of ASC format and procedures. The Guide below does not supersede the AIxCC Rules, which can be found on the AIxCC website: https://aicyberchallenge.com/rules/.

Additionally, the AIxCC Linux Exemplar Challenge Project (CP), which is structured based on the current CP specification, is publicly available at https://github.com/aixcc-public/challenge-001-exemplar.

Additional resources are available on the AIxCC Competitor Dashboard and are limited to verified AIxCC competitors.

Click here to reference the archived AIxCC Request for Comments, which was live from December 13, 2023, through January 15, 2024.

SCORING ALGORITHM & EXEMPLAR CHALLENGE PREVIEW (RELEASED DECEMBER 13, 2023)

Please note: The ASC Procedures and Scoring Guide is the authoritative documentation. Please refer to that document for the latest guidance.

SMALL BUSINESS TRACK WINNERS

42-b3yond-6ug
LACROSSE
PANACEA
Shellphish
Trail of Bits
VERSATIL
Zellic

FINALIST TEAMS

42-b3yond-6ug
all_you_need_is_a_fuzzing_brain
LACROSSE
Shellphish
Team Atlanta
Theori
Trail of Bits

SEMIFINAL TEAM LIST

42-b3yond-6ug
AI Guard
Aitomatic_DAIC_Collaboration
Alien Illuminati R Us
all_you_need_is_a_fuzzing_brain
Autonomous Cyber
BlueWaterWhiteSmoke
Buffalo
CASAndra
CipherTen
Composure
Debug Dragons
DSU Trojans
Fire Eye
FoodieBoys
Fuzzland
HammerAI
Healing Touch
illinoiSE
IntelliAgents
Interceptive
Kalibr Research
KORIA
Kri Labs
LACROSSE
Lauretta Labs
LoneStarFuzz
MADBUGS
Mindrake
Mistborn_417
PANACEA
PicoloLabs
SashiKode
Shellphish
SODIUM-24 AI
Team Atlanta
Theori
Trail of Bits
VERSATIL
WASPS
wpAppDev
Zellic

VISUALIZING AIxCC: BRINGING YOUR CODE TO LIFE

The AIxCC challenge projects (CPs) and challenge project vulnerabilities (CPVs) vary in size, scale, and complexity. To communicate the game details to a variety of audiences—including competitors, hackers, policymakers, and students—the AIxCC Organizers developed code visualization videos to illustrate the scope of the code involved, the impact of vulnerability exposures, and the difficulty of the tasks faced by competitors.

To learn more, take a look at the playlist below and pay specific attention to the color-coded functions highlighted across each CP and CPV: Green connections show the full extent of the code in a particular CP; yellow focuses on code reachable from an interface exposed to competitors; and red narrows down to the path where a CPV is triggered, showing detailed diagnostic information typically used to exploit or patch such vulnerabilities.
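To make the green/yellow/red distinction concrete, here is a minimal Python sketch of the underlying idea—classifying a project's functions by whether they are reachable from the competitor-facing interface, and which of them lie on the path that triggers a vulnerability. The call graph, function names, and triggering path are all hypothetical; this is not AIxCC tooling or real challenge code.

    # A toy call graph: each function maps to the functions it calls.
    # All names here are made up for illustration.
    from collections import deque

    call_graph = {
        "harness_entry": ["parse_input"],
        "parse_input": ["validate_header", "copy_payload"],
        "validate_header": [],
        "copy_payload": ["memcpy_wrapper"],
        "memcpy_wrapper": [],
        "unrelated_helper": ["validate_header"],
    }

    def reachable_from(graph, start):
        """BFS over the call graph: every function reachable from `start`."""
        seen, queue = {start}, deque([start])
        while queue:
            for callee in graph.get(queue.popleft(), []):
                if callee not in seen:
                    seen.add(callee)
                    queue.append(callee)
        return seen

    # "Green": the full extent of the CP's code.
    all_functions = set(call_graph)
    # "Yellow": code reachable from the interface exposed to competitors.
    reachable = reachable_from(call_graph, "harness_entry")
    # "Red": the specific call path that triggers the CPV (hypothetical).
    trigger_path = ["harness_entry", "parse_input", "copy_payload", "memcpy_wrapper"]

    print(sorted(all_functions - reachable))  # code the harness never touches

In the videos, the same nesting applies at real-world scale: the red triggering path is a sliver of the yellow reachable code, which is itself a sliver of the green whole.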

DEF CON 32: THE AIxCC EXPERIENCE

IN THE STUDIO TALKS

Sets the stage for the Artificial Intelligence Cyber Challenge (AIxCC), from its origins to its strategic vision. Our stakeholders cut across critical industries and understand the urgency behind AIxCC’s ambitious goals.

AIxCC EDUCATION

Dives into the technical specifications of the competition, including the various tools that can support vulnerability discovery and remediation as well as the Challenge Projects (CPs) that measured the performance of competitor submissions.