North Carolina AI Laws You Should Know (2026)

Artificial intelligence is moving fast from “interesting” to “operational” for North Carolina businesses. Between new statewide government guidance, an executive order focused on trustworthy AI, and multiple bills aimed at privacy and synthetic media harms, 2026 is shaping up to be a year where organizations need to treat AI like any other regulated business system: governed, documented, monitored, and secured.

Below is a roundup of the North Carolina AI-related laws and policy moves you should have on your radar in 2026, plus what to do about them.

Quick note: This article is for informational purposes only and is not legal advice. Talk with counsel for interpretations specific to your business and industry.

North Carolina AI Laws

1) Executive Order No. 24 (Sept. 2, 2025): Trustworthy AI for North Carolina government

On September 2, 2025, Governor Josh Stein signed Executive Order No. 24, focused on “advancing trustworthy artificial intelligence” across the state. While an executive order is not the same as a statute that directly regulates private companies, it still matters for two big reasons:

  1. It sets the direction for how state agencies will procure, deploy, and govern AI. If you sell to the State of North Carolina, partner with agencies, or support public sector clients, this influences contract language, security requirements, and acceptable AI use.
  2. It signals where legislation may go next. Executive orders often become “blueprints” for future policy.

The EO describes efforts like AI leadership and oversight structures and calls for governance and risk assessment approaches across state government.

What businesses should do in 2026:

  • If you work with state agencies (or plan to), be prepared for AI-specific contract requirements: data handling rules, security controls, transparency expectations, and documentation of AI use cases.
  • Treat AI tools as part of your vendor risk program, not “just another app.”

2) NCDIT’s Responsible Use of AI Framework: the baseline for state agencies (and a strong model for everyone)

North Carolina’s Department of Information Technology (NCDIT) published the North Carolina State Government Responsible Use of Artificial Intelligence Framework as guidance for state agencies to reduce privacy and data protection risks while using AI.

Even if you are not in government, this framework is useful because it reflects how a major public institution expects AI systems to be evaluated: principles, practices, and risk management. NCDIT also publishes “Principles for Responsible Use of AI” and related resources tied to privacy and assessment.

What businesses should do in 2026:

  • Build your own AI policy (acceptable use, data rules, human review requirements, and approval processes).
  • Require vendors to explain how their AI models handle sensitive data and how outputs can be audited.

3) North Carolina Personal Data Privacy Act proposal: a major privacy move that impacts AI

A big story for 2026 is privacy. House Bill 462 has been summarized as enacting an “NC Personal Data Privacy Act,” with an effective date described as January 1, 2026, creating a new chapter focused on data privacy concepts like “controllers” and “processors.”

Whether or not you’re “doing AI,” comprehensive privacy obligations matter because AI systems are often trained on, enriched by, or output insights from personal data. If privacy obligations expand, your AI roadmap has to expand with them.

Why this matters for AI:

  • AI tools frequently touch personal data (customer support transcripts, HR info, marketing audiences, CRM records).
  • Automated decisions can create risk around fairness, transparency, and consumer expectations, especially when decisions feel “black box.”

What businesses should do in 2026:

  • Inventory where personal data is used in AI workflows (chatbots, ticket routing, analytics, recruiting, marketing).
  • Create a rule: no sensitive data in AI prompts unless the tool is approved, contracted, and configured for it.
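One lightweight way to back up that rule with a technical control is a pre-submission check that scans prompts for obvious sensitive patterns before they reach an unapproved AI tool. The patterns and tool allowlist below are illustrative assumptions, a minimal sketch rather than a complete DLP solution:

```python
import re

# Illustrative patterns only -- a real deployment would use a DLP product
# or a fuller PII-detection library, not three regexes.
SENSITIVE_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

# Hypothetical allowlist of tools approved, contracted, and configured
# for sensitive data.
APPROVED_FOR_SENSITIVE_DATA = {"internal-copilot"}

def check_prompt(prompt: str, tool: str) -> list[str]:
    """Return the names of sensitive patterns found in a prompt bound
    for a tool that is not approved for sensitive data."""
    if tool in APPROVED_FOR_SENSITIVE_DATA:
        return []
    return [name for name, pattern in SENSITIVE_PATTERNS.items()
            if pattern.search(prompt)]

# Example: this prompt would be blocked for an unapproved tool.
violations = check_prompt("Customer SSN is 123-45-6789", "public-chatbot")
```

In practice a check like this would sit in a browser extension, an API gateway, or an enterprise DLP policy, paired with training so employees understand why a prompt was blocked.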

4) Deepfakes and synthetic media: proposed restrictions and new criminal offenses

North Carolina lawmakers have been active on synthetic media and deepfakes.

House Bill 375: “Artificial Intelligence and Synthetic Media Act” (proposed)

A legislative summary describes HB 375 as creating a new chapter covering AI and synthetic media, including election-related “materially deceptive media” definitions and protections aimed at misuse and deceptive content.

House Bill 934: “Unlawful distribution of a deepfake” (proposed)

A bill text for HB 934 includes a proposed criminal offense for the unlawful distribution of a deepfake and outlines conduct such as creating or distributing deepfakes to harass, extort, cause harm, or influence an election, with penalties described in the bill.

What businesses should do in 2026:

  • Train employees on deepfake risk: fake vendor payment requests, “CEO voice” phone calls, synthetic video in HR and recruiting scams.
  • Update security processes: require out-of-band verification for wire transfers, payroll changes, and password resets.
  • Establish a communications plan for synthetic media incidents (who investigates, who contacts platforms, who informs customers).

5) AI in education: ethics and literacy bills to watch

AI policy isn’t just about security. North Carolina’s legislature has also considered education-focused AI efforts, like Senate Bill 640, which is titled “AI Ethics and Literacy Across Education.”

For businesses, this matters because workforce expectations shift. Students entering the workforce may have new training around AI, ethics, and responsible use, and employers will be expected to match that maturity.

What businesses should do in 2026:

  • Expand security awareness training to include AI: phishing that uses perfect grammar, synthetic voice calls, fake “internal policy” documents generated by AI.
  • Add clear policies for AI use in customer communications, proposals, and marketing.

6) State AI programs and funding: signs of acceleration (not just regulation)

Beyond regulation, several state efforts point to North Carolina trying to accelerate AI adoption in a structured way.

Whether you see AI as opportunity or risk, the trend is clear: more adoption, more scrutiny, more governance.

7) Don’t forget the “already-here” laws AI can trigger: data breaches and consumer protection

Even without an “AI law,” AI can cause legal problems under existing rules.

NC Identity Theft Protection Act and breach notification

North Carolina’s Identity Theft Protection Act includes breach notification obligations. North Carolina DOJ guidance explains that businesses and state/local government must notify people when certain breaches involve personal identifying information.

AI tools can increase breach risk if sensitive data is pasted into chatbots or used in third-party AI platforms without proper controls.

What businesses should do in 2026:

  • Treat AI tools like any other system that touches confidential or sensitive data: access control, logging, retention policies, and vendor security review.
  • Build a “safe prompting” standard and enforce it with training and technical controls (DLP where possible).

A practical 2026 checklist for North Carolina organizations using AI

If you want a clean starting point, here’s a short checklist we recommend:

  1. AI Use Inventory: List where AI is used (internal and customer-facing).
  2. AI Policy: Define approved tools, prohibited data, and required human review.
  3. Vendor Risk Review: Contracts, data handling, retention, and auditability.
  4. Incident Readiness: Deepfake response, fraud verification steps, and breach procedures.
  5. Training: AI-enabled phishing, synthetic voice/video scams, and acceptable use.
  6. Security Hardening: MFA everywhere, least privilege, secure email, and financial controls for payment changes.
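To make item 1 concrete, an AI use inventory can start as a simple structured record per use case. The fields and example entries below are assumptions to adapt to your own environment, not a prescribed schema:

```python
from dataclasses import dataclass

@dataclass
class AIUseCase:
    """One row in an AI use inventory (illustrative fields only)."""
    name: str
    tool: str             # vendor or product in use
    owner: str            # accountable team or person
    customer_facing: bool
    personal_data: bool   # does it touch personal data?
    human_review: bool    # is output reviewed before use?
    approved: bool = False

# Hypothetical example entries.
inventory = [
    AIUseCase("Support ticket triage", "helpdesk-ai", "IT",
              False, True, True, approved=True),
    AIUseCase("Resume screening", "recruiting-ai", "HR",
              False, True, True),
]

# Flag entries that need attention: personal data in an unapproved tool.
needs_review = [u.name for u in inventory
                if u.personal_data and not u.approved]
```

Even a spreadsheet with these columns works; the point is that every AI use case has a named owner, a data classification, and an approval status before the vendor review in item 3 happens.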

How PivIT Strategy helps

At PivIT Strategy, we approach AI like we approach cybersecurity: reduce risk without slowing the business down. That typically means helping clients set up AI usage policies, strengthen identity and email security (where deepfakes and AI-assisted phishing hit hardest), and build vendor review and incident response processes that match the real-world threats showing up in 2026.

Frequently Asked Questions: North Carolina AI Laws (2026)

Are there specific AI laws in North Carolina as of 2026?

North Carolina does not yet have a single, comprehensive AI statute that regulates all private-sector AI use. However, executive orders, proposed legislation, and existing privacy, consumer protection, and cybersecurity laws already impact how businesses can deploy AI systems in the state.

Does North Carolina’s AI executive order apply to private companies?

Executive Order No. 24 primarily applies to North Carolina state agencies. That said, private companies that sell to, partner with, or support state and local government entities may be required to follow AI governance, data handling, and security expectations outlined in state procurement and contract requirements.

How do North Carolina privacy laws affect AI tools?

Proposed privacy legislation in North Carolina focuses on how personal data is collected, processed, and protected. Since many AI tools process customer, employee, or consumer data, businesses must evaluate how AI systems use personal information and whether that use aligns with privacy obligations.

Can my business use AI tools like ChatGPT or Copilot in North Carolina?

Yes, businesses can use AI tools, but they should do so with clear internal policies. This includes rules around what data can be entered into AI systems, approval of AI vendors, and review of AI-generated outputs to reduce privacy, security, and compliance risks.

Read More:

South Carolina AI Laws (2026)

Mitch Wolverton

Mitch, Marketing Manager at PivIT Strategy, brings many years of marketing and content creation experience to the company. He began his career as a content writer and strategist, honing his skills on some of the industry’s largest websites, before advancing to specialize in SEO and digital marketing at PivIT Strategy.