Mike Vizard examines the results of a Black Duck Software survey revealing the rapid integration of AI coding tools into embedded systems development. He discusses the resulting security, license compliance, and governance challenges now facing organizations.

Survey Surfaces Raft of AI Coding Issues Involving Embedded Systems

Author: Mike Vizard

A Censuswide survey conducted for Black Duck Software explored how AI technologies are transforming embedded systems development, highlighting both rapid adoption and new risks.

Key Findings

  • Widespread AI Use: 89% of respondents’ organizations use AI coding assistants for embedded system development, reflecting rapid industry-wide adoption.
  • Open Source AI Models: 96% of professionals surveyed are integrating open source AI models into products, increasing complexity in license management.
  • Security Concerns: 21% lack confidence in preventing AI-generated code from introducing vulnerabilities, and 18% are unsure they can manage open source license risks.
  • Governance Lag: While productivity gains are clear, many organizations have not yet implemented governance and security protocols to match the pace of AI adoption.

Corey Hamilton (Black Duck Software) noted that AI coding tools offer productivity but also elevate potential risks, especially in safety-critical embedded applications.

Programming Landscape Shifts

  • Python Leads: Python is now the top language for embedded system development (27%), with C++ (26%), Java (22%), and JavaScript (21%) following.
  • SCA and SBOM Adoption: Use of Software Composition Analysis (SCA) tools is increasing: 39% of organizations scan with every build and pull request, and 35% scan within IDEs.
  • SBOM Requirements: 71% can now produce a Software Bill of Materials (SBOM), often driven by partner or customer mandates.

Security, Compliance, and DevOps Impacts

  • License Compliance: More than half of companies actively scan main components and copy-pasted code snippets for license compliance.
  • Management vs. Developer Views: Leadership and hands-on engineers disagree on project success: 86% of CTOs express confidence, compared with only 56% of developers.
  • Legacy Risks: Embedded systems present an attractive attack surface, as many older platforms still lack adequate security controls.
  • Need for Best Practices: The industry trend is toward replacing outdated systems and adopting DevSecOps methodologies, but it remains an open question whether AI-generated code will introduce new vulnerabilities or help mitigate them.

Conclusion

As embedded systems proliferate and AI coding assistants become mainstream, organizations must improve security governance, keep pace with compliance requirements, and close the gap between perceived and actual software quality. Effective DevSecOps and robust license management processes are crucial to addressing the new risks posed by rapid AI integration.


This post appeared first on “DevOps Blog”.