Technology Law Insights, Volume 6, Issue 8, 2025

 

 [co-author: Addison Gills]*


Welcome

Welcome to our eighth 2025 issue of Decoded, our technology law insights e-newsletter.

We hope you enjoy this issue and thank you for reading.


Cybersecurity Ranks Among Top Three Risks to Manufacturing Sector

“Most companies are planning major AI investments to address growing threats to OT systems.”

Why this is important: With manufacturing returning to the U.S., the manufacturing sector is taking a hard look at the risks involved in making products here in America. Cybersecurity for operational technology now ranks among the sector's top three risks, behind only inflation and economic growth. As manufacturing becomes more automated and connected, manufacturers' operational technologies, e.g., robotic welders, automated assembly lines, and command and control systems, are vulnerable to attack by bad actors. In fact, the manufacturing sector experienced the most ransomware attacks of any sector in the second quarter of 2025. In response to these risks, manufacturers are taking cybersecurity more seriously and prioritizing it. This includes the adoption of industry standards and frameworks to protect their operational technology, including the NIST SP 800-82 Guide to Industrial Control Systems Security, the NIST IR 8183 Cybersecurity Framework Manufacturing Profile, the CISA Recommended Cybersecurity Practices for Industrial Control Systems, and the ISA/IEC 62443 standards. These governmental and industry standards and frameworks, along with the implementation of common-sense practices and AI-powered cybersecurity tools, will assist the manufacturing sector's resurgence in the U.S. --- Alexander L. Turner


Financial Impact from Severe OT Events Could Top $300B

“A report from industrial cybersecurity firm Dragos highlights growing risks of business interruption and supply-chain disruptions.”

Why this is important: Information Technology (IT) rightly garners attention and resources to ensure protection, maintenance, and versatility in the modern era. Operational Technology (OT), which covers physical processes, often in critical infrastructure, is increasingly coming under attack by cybercriminals seeking to disrupt and hold hostage as many systems as they can. Cyberattacks, whether employing ransomware, malware, or the corruption and disruption of operational technology, are likely to cost companies billions of dollars each year in the aggregate for ransom payments, insurance claims, and other forms of increased protection. Cybersecurity firms, using probability statistics and a decade's worth of cyber breach and insurance claim data, estimate that business interruptions to operational technology could reach a global aggregated total of $31 billion over the next 12 months. Attacks have successfully targeted supply chain operations and online transaction capabilities. “The three OT security controls most associated with risk reduction were maintaining a comprehensive incident-response plan, using defensible architecture and performing continuous monitoring to preserve visibility into a network.” Preparation is key. Run audits and update systems regularly. Assuming a hack has not occurred simply because all systems are running is not a safety plan. Having appropriate breach protocols in place, from staff to third-party vendors, can make the difference between a few hours of downtime costing a few dollars and months of downtime costing millions. --- Sophia L. Hines


Could Water-Free Data Centers Move from Concept to Reality?

“Microsoft plans to pilot water-free facilities by 2026, but can the industry truly eliminate cooling water without sacrificing energy efficiency?”

Why this is important: This article examines water-free data centers as an emerging solution to address water scarcity concerns in the industry. Traditional data centers consume substantial amounts of water for cooling through evaporation, averaging 1.8 liters per kilowatt hour. As water resources become increasingly strained, some operators are moving beyond efficiency improvements to pursue complete water elimination from their cooling processes.

Microsoft leads this initiative with plans to pilot water-free data centers starting in 2026, while companies like NOVVA and Vertiv have developed similar concepts. The technical approaches include liquid immersion cooling using non-conductive fluids, which is highly efficient but expensive, and mechanical cooling systems that are more affordable but consume significantly more electricity.

This creates a trade-off between water and energy sustainability. Operators with renewable energy access may find the increased electrical consumption acceptable to achieve zero water usage. Enhanced designs can improve mechanical cooling efficiency through sealed server cabinets or hybrid direct-to-chip systems using mechanically chilled fluids.

While the technology exists, widespread adoption remains unlikely due to cost and complexity. Water-free data centers will likely remain specialized applications, with success depending on a careful balance between water elimination, energy efficiency, and economic viability. --- Shane P. Riley


Research Suggests Doctors Might Quickly Become Dependent on AI

“The study looking at gastroenterologists in Poland found that they appeared to be about 20% worse at spotting polyps and other abnormalities during colonoscopies on their own, after they'd grown accustomed to using an AI-assisted system.”

Why this is important: AI's increasing use in medicine, particularly in diagnostics, presents both groundbreaking opportunities and significant legal risks. A recent study published in The Lancet Gastroenterology & Hepatology highlights one such risk: the potential for over-reliance on the technology. AI is being used to help doctors screen patients, but the study found that gastroenterologists in Poland who had grown accustomed to AI-assisted screening were about 20 percent worse at detecting possible polyps on their own. The authors suggest a "safety-net effect," in which doctors may subconsciously rely on the AI's prompts rather than on years of training and decades of experience. This raises questions about the use of AI and whether missing a diagnosis because of AI, or the lack of it, could increase legal exposure.

The increasing integration of AI in medical practice presents a number of potential legal issues for healthcare providers and technology companies. Our law firm can assist in developing risk management strategies, creating clear legal frameworks for AI use, and providing counsel on regulatory compliance to help businesses proactively plan for the future. --- James T. Taylor


Santander Charts "Data and AI-First" Future with New OpenAI Partnership

“In the first two months of the partnership, over 15,000 Santander employees have been granted access to OpenAI's ChatGPT Enterprise.”

Why this is important: The Spanish banking giant Santander has been leveraging AI to help streamline operations and save euros. Recently, it partnered with OpenAI to give more than 15,000 employees access to ChatGPT Enterprise. That number will rise to 30,000, about 15 percent of its global workforce, by year's end. This move is part of its plan to become an "AI-native" bank where "every decision, process and interaction is powered by data and intelligent technology." In the near future, its plans for AI include "scaling agentic AI, transforming front- and back-office processes and enabling fully conversational banking," with AI copilots evolving into "decision-making partners" while virtual assistants handle customer transactions. The article also discusses the impact on Santander's bottom line, estimating that the bank saved over €200 million last year alone through its AI initiatives. --- Nicholas P. Mooney II


Tract Pulls Plans for North Carolina Data Center on Land Formerly Owned by NASCAR Legend Dale Earnhardt

“Scheme dubbed the ‘Concrete Monster’ has been canned.”

Why this is important: With the rise of AI comes an increase in the construction of data centers throughout the country. Local communities' reactions to these projects have been mixed. Many community members object to the construction of data centers because they are an eyesore, drive up electricity rates, take over vast swaths of viable farmland, and use large volumes of water to cool their many servers. Recently, Tract withdrew a plan to build a data center on a 400-acre site 30 miles north of Charlotte. The decision to withdraw the plan was based on local opposition to the data center complex. --- Alexander L. Turner


Pennsylvania Government Agencies Aim to Expand Employees’ Use of Artificial Intelligence

“As Pennsylvania leaders aspire to set the state up as an AI and data center hub, the commonwealth is one of the first states to examine generative AI usage across the state government.”

Why this is important: Pennsylvania is leading efforts to expand artificial intelligence tools throughout state and local government agencies following a successful pilot program that demonstrated significant efficiency gains. The state launched a year-long generative AI pilot program in January with 175 employees across 14 agencies using ChatGPT Enterprise for tasks like writing assistance, research, brainstorming, and document summarization. The program cost $108,000 and yielded impressive results -- participants saved an average of 95 minutes per day and reported positive experiences overall.

Based on the pilot's success, Pennsylvania is now exploring ways to expand AI access to more state employees. The state requires training on "safe and responsible AI use" before employees can access these tools. Key agencies showing interest include the Department of Human Services, the Department of Environmental Protection, and the Housing Authority of Pittsburgh.

Pennsylvania has established comprehensive oversight through Gov. Josh Shapiro's 2023 executive order, which created a Generative AI Governing Board and standards for government use. Current policies prohibit using AI for employee decisions, require human verification of AI-generated content, and ban inputting private data into AI tools.

Despite the enthusiasm, experts caution about AI's tendency to "hallucinate" or create false information. Carnegie Mellon's Cole Gessner advises treating AI "like a summer intern" that requires double-checking. Most local governments still lack specific AI policies, though Allegheny County is developing guidelines. --- Shane P. Riley


America's Power Grid is No Match for the AI Data Center Boom

“Data center demand is surging for computing facilities that can consume as much power as entire cities, but America's electrical grid is struggling to keep pace.”

Why this is important: Artificial Intelligence (AI) is not new, but it has certainly taken center stage in national energy production and development discussions. It should come as no surprise that the United States' energy grid is old and in need of serious renovation. In addition to current development and consumer needs, AI data centers are requesting and requiring multiple gigawatts of power; for reference, current AI data center requests are equivalent to all of the power used in the state of New York. Some analysts estimate that trillions of dollars are likely to be invested between now and 2030, with as much as $3 trillion by 2028. One roadblock, aside from the serious environmental impact and pollution likely to result from this AI surge, is access to fundamental components such as transformers. Supply chain delays, resource market fluctuations, and ever-increasing competition compound the problem as hotels, professional service firms, and schools all develop their own AI models; the market cannot sustain this growth indefinitely. The increased demand on the grid is also delaying other legitimate projects because of a backlog of planned, and possibly hypothetical, AI data center projects that are not certain to maintain funding or obtain the necessary land. Acres of space, gigawatts of power, and immense environmental consequences are the holy trinity of the AI frenzy. How the grid will sustain itself and expand to meet demand remains to be seen. Coupled with unprecedented federal deregulation, the AI data center gold rush is well underway. --- Sophia L. Hines


Risk Management, Legacy Tech Pose Major Threats to Healthcare Firms, Report Finds

“Companies have improved their recovery processes and user controls but still lag in risk preparedness, according to the report.”

Why this is important: On average, healthcare companies receive and maintain more sensitive information than companies in other industries, making the healthcare sector a prime target for cyberattacks. In 2024, 92 percent of healthcare organizations reported cyberattacks, and nearly 70 percent saw patient care impacted. Common areas of vulnerability include legacy systems, recovery processes, response planning, post-incident communications, and threat analysis maturity. Focused improvements in these areas are essential both for protection against cyberattacks and for compliance with potential upcoming federal regulations.

On January 6, 2025, the U.S. Department of Health and Human Services (HHS) published a notice of proposed rulemaking in the Federal Register detailing proposed changes to the HIPAA Security Rule. If finalized, it would be the first major update to the Security Rule in two decades. The comment period closed on March 7, 2025, and review of the comments began thereafter. The proposed rule aims to improve cybersecurity and better protect the U.S. health care system from a growing number of cyberattacks. Among other things, it would mandate specific risk analyses and the use of multi-factor authentication. --- Joseph C. Unger


Cybersecurity in Higher Education: A Critical Investment Opportunity in an Era of Rising Threats

“Universities manage hundreds of domains, many outdated or unmaintained, creating entry points for hackers.”

Why this is important: Higher education institutions continue to be prime targets of cyberattack groups. Columbia University was a recent victim of a cyberattack that resulted in a class action lawsuit and reputational harm. One reason educational institutions are targeted is the volume of valuable data they maintain: student information, research intellectual property, and financial records are all valuable to a cyber attacker. Other reasons attackers focus on educational institutions include schools' use of outdated technology and the complexity universities face in managing many different domains, both of which create points of weakness in data protection. The vast amount of data and the number of domains give cyber attackers multiple entry points for access.

The increase in cyberattacks, especially those against higher education institutions, has created opportunities in the cybersecurity market. As this article highlights, in 2024 alone, ransomware attacks against educational organizations increased by 75 percent year-over-year, and these breaches cost organizations dearly, with average costs reaching $2.8 million (including administrative downtime and ransom payments) in 2024. As a result, the following segments of the cybersecurity market have caught investors’ attention: endpoint security, identity and access management, encryption and data protection, and AI-powered solutions. There are plenty of tech-based companies working on these offerings. AI technology can monitor organizational data and detect attacks in an expedited manner; meanwhile, companies that identify weak spots and vulnerabilities in organizational security are also experiencing growth.

Universities must invest in their own cybersecurity and take the steps necessary to avoid becoming part of these statistics. A critical problem some educational organizations face is a lack of funds to invest in these protections due to budget constraints, and in response, some states like New York and California have allocated money to schools for upgrades. Where budget constraints exist, institutions should nevertheless engage in prevention and incident response planning to the extent of their capabilities. For further assistance, please contact your Spilman counsel and review this article on best practices. --- Nicholas A. Muto


‘It’s Just Bots Talking to Bots’: AI is Running Rampant on College Campuses as Students and Professors Alike Lean on the Tech

“AI use is continuing to cause trouble on college campuses, but this time it’s professors who are in the firing line.”

Why this is important: AI is increasingly reshaping higher education in profound ways, as seen in this recent Fortune article highlighting how professors, not just students, are immersed in generative AI platforms like ChatGPT. Faculty report that using AI for lesson preparation and grading has become pervasive, shifting expectations and workflows across campuses. Meanwhile, institutions are moving away from bans and toward integrating AI into their systems, such as the OpenAI partnership with the Canvas LMS, which supports assignment design, personalized learning experiences, and streamlined administrative tasks while keeping educators in control.

This rapid evolution in higher education signals new opportunities and risks for ed‑tech developers, infrastructure planners, and compliance teams. Expect the industry to experience expanding demand for AI-driven educational platforms, digital compliance frameworks, and assistive tools that support accessibility. At the same time, rising concerns about academic integrity, data privacy, and equity of access have led universities to reconsider assessment models, like oral exams and process‑oriented assignments, to maintain critical thinking outcomes. Industry stakeholders should be prepared to advise on policy development, licensing, and AI governance, as institutions seek partners who can help them navigate the ethical, technological, and legal dimensions of AI integration in education.

*Summer Associate

 

DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations. Attorney Advertising.

© Spilman Thomas & Battle, PLLC
