Data and Privacy

AI for allied health: data sovereignty in Australia

If you are using AI tools with patient or participant data, you need to know where that data goes. Here is what Australian data sovereignty means and why it matters for your clinic.

AI tools for allied health are everywhere now. Report writers, transcription services, clinical note assistants, image analysis, appointment scheduling bots. They all process patient data through AI models. The question most practitioners do not ask: where does that data actually go?

Under the Privacy Act 1988 (amended December 2024) and the Australian Privacy Principles, health information has the highest level of protection. Sending participant data to AI infrastructure outside Australia creates compliance risks that most practitioners do not realise they are taking on.

01

What data sovereignty means in practice

Data sovereignty means your data is subject to the laws of the country where it is physically stored and processed. If participant data is processed by a server in the United States, it is subject to US law regardless of where your clinic is located or what your privacy policy says. This includes the US CLOUD Act (Clarifying Lawful Overseas Use of Data Act, 2018), which allows US government agencies to compel any US-headquartered company to hand over data it holds, regardless of where the servers are physically located. A US company storing data on Australian servers is still subject to the CLOUD Act.

02

The Privacy Act and health information

The Privacy Act 1988 classifies health information as 'sensitive information' with additional protections beyond ordinary personal information. Australian Privacy Principle 8 restricts cross-border disclosure: you must take reasonable steps to ensure overseas recipients handle data in accordance with the APPs, or obtain explicit informed consent from the individual. The December 2024 amendments added transparency requirements for automated decision-making: if you use AI tools to make, or substantially assist in making, decisions involving personal information, your privacy policy will need to disclose this once those provisions take effect. A new statutory tort for serious invasion of privacy also gives individuals whose health data is mishandled a direct cause of action. The practical implication: if you paste patient notes into a US-hosted AI tool and something goes wrong, you may be personally liable, not just facing a regulatory complaint.

03

Where most AI tools process data

Most AI tools marketed to Australian health practitioners process data through US-based cloud infrastructure. OpenAI (ChatGPT, GPT-4) processes data through Microsoft Azure, primarily in US data centres. Google's AI tools use US-based infrastructure unless specifically configured for Australian regions. Most startup AI tools built on top of OpenAI or Anthropic APIs inherit those companies' data processing locations. Unless the vendor explicitly confirms Australian data residency with documentation, assume the data leaves Australia. 'Our data is encrypted' is not the same as 'our data stays in Australia.' Encryption protects data in transit and at rest, but it does not change which country's laws apply. Data sovereignty is about where the data is processed and which laws apply to it.
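
To make the region question concrete, here is a minimal sketch of what pinning an Australian region can look like when calling a cloud-hosted model, using Google's Vertex AI Python SDK as an example. The project ID and model name are placeholders, and you would still need to confirm that the model you want is actually offered in the Sydney region and that the vendor's contractual terms satisfy APP 8; choosing a region is necessary, not sufficient.

import vertexai
from vertexai.generative_models import GenerativeModel

# Pin processing to australia-southeast1 (Sydney). If location is left unset,
# the SDK falls back to a default region that is typically US-based.
vertexai.init(project="your-gcp-project-id", location="australia-southeast1")

# Placeholder model name; check regional availability before relying on it.
model = GenerativeModel("gemini-1.5-flash")
response = model.generate_content("Summarise this de-identified progress note: ...")
print(response.text)

The same principle applies to any cloud provider: the region should be set explicitly and confirmed in writing, not assumed from the vendor's marketing.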

04

The OAIC guidance on AI and privacy (October 2024)

In October 2024, the Office of the Australian Information Commissioner published guidance specifically about using commercially available AI products with personal information. The key points: organisations must conduct a privacy impact assessment before using AI tools with personal information. You must ensure AI tools do not use your data to train their models (most free-tier AI tools do exactly this). You must be transparent with individuals about how AI is used in processing their information. And you must ensure any overseas data transfer complies with APP 8. For allied health practitioners, this means: do not paste identifiable patient data into any AI tool without first confirming where the data goes, whether it is used for model training, and whether you have patient consent for that specific use.

05

NDIS data has additional protections

NDIS participant data is not just health information under the Privacy Act. It is also 'protected information' under Section 60 of the NDIS Act 2013. Protected information includes anything obtained in connection with the NDIS that could identify a participant: their name, NDIS number, disability details, plan goals, service records, and assessment results. Unauthorised disclosure of protected information is an offence under the NDIS Act, separate from any Privacy Act breach. The penalties are separate too. If you send NDIS participant data to an offshore AI tool without appropriate safeguards, you are potentially breaching both the Privacy Act and the NDIS Act.

06

Questions to ask any AI vendor before using their tool

Before using any AI tool with patient or participant data, ask these specific questions:

Where is my data processed? Name the cloud provider and the specific region (e.g. 'AWS ap-southeast-2, Sydney').

Is my data used to train or improve your AI models? If yes, do not use the tool with identifiable data.

Where is my data stored at rest? Same region requirement as processing.

Can you provide a Data Processing Agreement that specifies Australian data residency?

Who at your company can access my data, and under what circumstances?

Do you comply with the Australian Privacy Principles?

If the vendor cannot answer these questions clearly or deflects with general statements about security, that is your answer.

07

Data residency is not the same as data sovereignty

A US company that stores data on Australian servers has data residency in Australia. But that company is still subject to US law, including the CLOUD Act. True data sovereignty requires that the data is processed and stored by infrastructure that is not subject to foreign government access orders. In practice, this means Australian-operated infrastructure or cloud providers with Australian legal entities that are not subsidiaries of US companies. This distinction matters less for low-risk data. It matters a lot for health information and NDIS protected information.

Key takeaways

Ask before you paste

Before putting any patient data into any AI tool, confirm where it goes. If the vendor cannot tell you the specific cloud region, do not use it with identifiable data. Use it with de-identified or synthetic data only.
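
Where de-identified data is an option, even a basic redaction pass helps. Below is a minimal illustrative sketch in Python with hypothetical placeholder tokens; pattern matching like this misses free-text identifiers such as names, addresses, and clinical details that can re-identify someone, so treat it as a starting point for a workflow, not a compliance control.

import re

def redact(note: str) -> str:
    # Replace common structured identifiers with placeholder tokens.
    # Names and addresses in free text need a proper de-identification
    # tool or manual review, not a regular expression.
    patterns = [
        (r"\b\d{9}\b", "[NDIS_NUMBER]"),       # NDIS participant numbers are typically nine digits
        (r"\b\d{2}/\d{2}/\d{4}\b", "[DATE]"),  # dates written as dd/mm/yyyy
        (r"\b04\d{8}\b", "[PHONE]"),           # Australian mobile numbers without spaces
        (r"[\w.+-]+@[\w-]+\.[\w.-]+", "[EMAIL]"),
    ]
    for pattern, token in patterns:
        note = re.sub(pattern, token, note)
    return note

print(redact("Participant 430123456 (0412345678) reviewed on 03/07/2025."))
# -> Participant [NDIS_NUMBER] ([PHONE]) reviewed on [DATE].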

Free tools are not free

Free-tier AI tools almost always use your data to improve their models. This is stated in their terms of service. Using a free AI tool with identifiable patient data means you are contributing health information to a training dataset. This is not compatible with the Australian Privacy Principles.

Read the terms of service, specifically the data section

Most practitioners accept terms without reading them. For AI tools used with patient data, read the data processing section specifically. Look for: data location, model training opt-out, data retention period, and third-party access.

Compliance is your responsibility, not the vendor's

The AI vendor is not registered with AHPRA. If patient data is mishandled, the regulatory risk sits with you and your clinic. The OAIC, AHPRA, and the NDIS Quality and Safeguards Commission will look at what you did, not what the vendor promised.

Reclaim your time. Start approving reports instead of writing them.

Set up your clinic in under 2 minutes. No credit card required.

Or email us at support@secondshift.com.au to book a personal onboarding call.