TLDR: A recent ISACA survey reveals that Australian technology executives are deeply concerned about the risks associated with generative AI, including AI-driven cyber threats and deepfakes. A significant majority (67%) anticipate these threats will cause sleepless nights in 2026, yet only 8% feel adequately prepared to manage them. The report highlights increasing regulatory complexity, ransomware, and supply chain vulnerabilities as additional major concerns for tech leaders.
Australian technology leaders are increasingly anxious about the rapid rise of generative artificial intelligence (genAI) and its associated risks, according to a new survey by ISACA, the global IT governance and cybersecurity association. The ‘2026 Tech Trends & Priorities Pulse Poll,’ which surveyed nearly 3,000 security professionals worldwide, found that 64% of Oceania respondents identify genAI and large language models as key drivers of the 2026 agenda. This positions genAI ahead of other significant trends such as AI and machine learning (60%), data privacy and sovereignty (34%), and supply chain risk (34%).
Despite its transformative potential, genAI is a major source of worry for risk and security professionals. A striking 67% of respondents expect AI-driven cyber threats and deepfakes to pose a significant challenge in 2026, yet only 8% reported feeling very prepared to manage these emerging risks.
The survey also highlighted other pressing concerns for Australian tech executives. Approximately 45% are most worried about the ‘irreparable harm’ that could result from failing to detect or respond to a major breach. Supply chain vulnerabilities, exemplified by incidents affecting companies like Qantas, Dymocks, and British Airways, are a concern for 41% of respondents. Technical issues such as cloud misconfigurations and shadow IT were cited by 38%, while 36% are troubled by increasing regulatory complexity.
In response to these challenges, tech leaders are prioritizing several key areas. Regulatory compliance was named a top focus by 58% of respondents, followed by business continuity and resilience (52%), and cloud migration and security (48%). Three-quarters of those surveyed believe that enhanced cyber regulations will ultimately boost digital trust.
Jamie Norton, vice chair of the ISACA board, summarized the current landscape: ‘Security leaders are dealing with constant AI-driven threats, tighter regulation and growing expectations from executives, all while struggling to find and keep the right people.’ He added, ‘It’s a perfect storm that demands stronger leadership focus on capability, wellbeing and risk management.’ The findings corroborate other recent surveys, such as Proofpoint’s 2025 Voice of the CISO report, which found that 76% of Australian CISOs had experienced a material loss of sensitive information in the past year.