Date Published: 6 May 2026
Article by: Taylor Hosken, Senior Cybersecurity Consultant
What is Privacy Awareness Week 2026? - OAIC PAW 2026 Australia
Privacy Awareness Week ("PAW") is an annual initiative dedicated to promoting awareness of privacy rights and responsibilities, and the importance of safeguarding personal information. As a GRC consultant and data privacy professional, I consider it among the most significant cybersecurity awareness initiatives on the calendar. Privacy remains a critically important and evolving topic.
Australians generally demonstrate a sound understanding of what constitutes private information and the protections it warrants, alongside a strong foundation of practical common sense when it comes to the steps required to protect that information. This is reflected in the widespread familiarity with foundational best practices, such as refraining from sharing passwords, avoiding password reuse, and exercising discretion when uploading personal information to social media platforms.
Given this baseline awareness, one might reasonably ask: why do we continue to invest in dedicated privacy awareness campaigns, such as the OAIC's PAW 2026?
PAW 2026 Theme Explained: What "Trust" Means for Australian Privacy Compliance
The theme of PAW 2026 is Trust. “Trust is built here. In every privacy complaint. In every resolution”. Privacy awareness campaigns remain among the most critical cybersecurity initiatives precisely because individual proficiency in protecting personal information, however diligent, is only as effective as the weakest link in the broader data ecosystem.
A person may maintain robust password hygiene, minimise their digital footprint, or take deliberate steps to limit their online presence entirely. Yet these efforts can be wholly undermined the moment an organisation that holds their information, whether knowingly or otherwise, suffers a data breach.
It is for this reason that the handling of personal information must be governed by consistent, up-to-date procedures and standards. Only through a cohesive and well-enforced approach to data privacy can Australia meaningfully build public confidence and support businesses in sustaining the customer trust upon which they depend.
IAPP PAW 2026 Keynote Recap: Federal Privacy Commissioner Carly Kind on Trust and Complaint Handling
PAW 2026 was officially opened by the International Association of Privacy Professionals (IAPP) with a keynote address delivered by Carly Kind, Federal Privacy Commissioner at the Office of the Australian Information Commissioner (OAIC), centred on the themes of trust, complaint handling, and Australian privacy protection initiatives. The presentation was both compelling and thought-provoking. Commissioner Kind emphasised that building genuine, lasting trust requires the translation of privacy concerns into tangible, actionable outcomes – a principle as relevant to cybersecurity practice as it is to legal and regulatory frameworks.
While the address was tailored primarily to a legal audience, reflecting on its key messages through a cybersecurity lens surfaces a number of noteworthy considerations and areas warranting closer attention.
How to Demonstrate APP 1 Compliance: Investment, Transparency and the Bunnings Facial Recognition Case
Carly opened by identifying that action must come before trust can be built into Australia's approach to privacy. Action corresponds with defining goals, which can be broken down further into the need for investment (both financial and cultural), transparent practices (referring here to everyone's favourite Bunnings facial recognition case), and, most importantly, substantive practices. Recent enforcement actions have confirmed that ad hoc practices are insufficient to demonstrate compliance with Australian Privacy Principle (APP) 1, "Open and Transparent Management of Personal Information".
Investment, transparency and documentation are the bread and butter of a cyber governance, risk and compliance engagement. This is made even more important by the law currently placing the obligation on entities to be proactive and demonstrate their work 'upfront'. Consultants are constantly balancing financial constraints and resource management with investment and compliance – identifying cost-efficient pathways to effective risk management. Often this requires shifting cultural perspectives on cybersecurity. Culture and compliance tend to mirror each other, and both shape an organisation's security risk posture.
Carly used the Bunnings case as a clear example of how not to conduct a transparent practice. From a cybersecurity perspective, her case analysis indicated the absence of a clear, foundational component of GRC: procedural documentation and effective risk management. Organisations and security professionals alike can learn from Bunnings and the OAIC's interpretation of privacy legislation to understand the importance of creating a strong governance and risk management baseline.
APP 3 and the Fair and Reasonable Test: What Australian Organisations Need to Know in 2026
Carly’s second component of building trust concerns agency; the power should be rebalanced so the individual, rather than the organisation, determines how their data is used. This directly embodies APP 3, "Collection of Solicited Personal Information", and confirms the long-standing Fair and Reasonable legal test for determining whether online choice architecture is lawful. APP 3 has been heavily debated in both legal and cybersecurity professional circles, as both professions are responsible for advising on the controls and procedures organisations ought to implement to avoid unnecessary data collection and retention. Clarity on the technology parameters of this Principle has been heavily sought after.
Consultants must also be mindful of the Fair and Reasonable test for APP 3 when engaging with a client through a privacy lens. Website designs, public notifications, communication channels and privacy documentation uplifts are all tasks security consultants can undertake to strengthen and align their customers to a desired framework or industry benchmark. However, if these 'upgrades' compromise the design and clarity of the client's online choice architecture, the client may become exposed to a privacy complaint.
AI, Data Privacy and the OAIC's 2024 Guidelines: What Cybersecurity Consultants Must Know
Of particular interest within this section was Carly’s discussion of AI model training and AI scribes within sensitive industries. The OAIC published a guideline on AI models in 2024, which remains one of the most up-to-date and widely endorsed pieces of legal guidance on AI and data privacy for consultants and organisations to be aware of.
The guide reflects the requirements for transparency, recommending that developers update their privacy policies and notifications to disclose their AI use and model training. Developers can only collect personal information that is reasonably necessary for their functions or activities, and all health-sensitive or personal information must have clear consent.
Data Retention, AI Model Training and the Deletion Dilemma: Emerging Privacy Risks in Australia
As already discussed, consent is partially determined by the organisation’s online choice architecture. It is also judged based on the capacity, understanding and expected use of the data.
This notion of reasonable expectations is linked with data retention concerns, where often organisations have been found to have unreasonably retained personal information for longer than necessary. Data retention and deletion practices must be clearly defined and understood to protect an organisation from unnecessarily keeping sensitive information, or worse, incidentally using retained information to train their AI models.
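The mechanical core of a defined retention practice can be surprisingly simple. As an illustration only (the record structure, field names and seven-year window below are hypothetical, not drawn from the OAIC guidance), a retention sweep that flags records held past a stipulated period might be sketched as:

```python
from datetime import datetime, timedelta, timezone

# Illustrative retention window; real periods depend on legal and business context.
RETENTION_PERIOD = timedelta(days=365 * 7)

# Hypothetical output of a data mapping exercise: where each record lives
# and when the personal information was collected.
records = [
    {"id": "cust-001", "system": "crm", "collected": datetime(2017, 3, 1, tzinfo=timezone.utc)},
    {"id": "cust-002", "system": "cloud-backup", "collected": datetime(2025, 6, 1, tzinfo=timezone.utc)},
]

def due_for_deletion(records, now=None):
    """Return records held longer than the stipulated retention period."""
    now = now or datetime.now(timezone.utc)
    return [r for r in records if now - r["collected"] > RETENTION_PERIOD]

for record in due_for_deletion(records):
    # In practice, deletion might be gated behind a risk assessment
    # rather than executed automatically.
    print(f"Flag for deletion: {record['id']} held in {record['system']}")
```

The sketch deliberately stops at flagging rather than deleting, reflecting the open question of whether deletion should be automatic or assessed first.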
Data mapping exercises are a common and effective way for an organisation to learn where data is transferred and stored, so that all components of the information can be deleted when required. What is interesting, and where case law is yet to provide guidance, is the technology component of data mapping and deletion. Must a risk assessment occur before data can be deleted, or should data be automatically deleted after a stipulated time period? If a risk assessment is required, must it be performed manually by a human analyst, or can it be done more efficiently through a program or AI tool? Will the AI tool then retain and potentially learn from the sensitive data it is assessing for deletion, inadvertently causing a privacy breach?
Needless to say, this is an area of law where cybersecurity professionals must watch closely.
Vendor Risk and Data Minimisation: Why Supply Chain Privacy Is Australia's Next Big Challenge
A concerning note from the talk was that the law currently places the responsibility for data minimisation on the organisation. While this is fantastic from a consumer perspective, it fails to capture the nuances of modern-day business engagements.
In a world where every component of the business was conducted in-house, this allocation of responsibility would make sense. However, most organisations outsource components of their business (such as online collaboration platforms or cloud storage). Most of the time, the vendor holds the contractual power, and it can be difficult for the purchasing organisation to enforce data minimisation activities or obtain evidence of them.
Supply chain attacks are not uncommon, and it would be wise to watch this space for privacy law developments.
Privacy in Australia Is Evolving. Is Your Organisation Keeping Up?
Carly’s talk left ample room for discussion; the points above are just the tip of the iceberg she presented to IAPP. "Privacy" may be a commonly understood term, and individuals and organisations may believe they have a tight grasp on their rights and obligations.
However, it is a continuously developing space, and technology evolves. Cybersecurity consultants must be at the forefront of these changes, adequately prepared to give their customers up-to-date advice to protect them not just from traditional cyber incidents, but privacy incidents as well.
