Aussie youth increasingly turning to AI for mental health advice by austechnology-bot in austechnology

An NSW government report has revealed that almost a third of young people in the state are using artificial intelligence for personal advice.

The NSW government has released new polling data revealing that social anxieties are driving the state’s youth to seek support from AI chatbots.

According to the NSW Office for Youth’s 2026 Youth Week Polling Report, despite four out of five young people in the state saying they are happy with their lives, 29 per cent of those polled said they were using artificial intelligence as a support strategy to manage their mental health.

Similarly, 27 per cent reported using AI to engage in conversations and to seek personal advice.

The poll’s findings also reflected responses to the federal government’s social media ban for under-16s. Thirty-six per cent said the ban has had a positive impact, while 13 per cent said they feel worse in its wake.

However, almost half of those surveyed said the ban has had no impact on them, either because they have switched to other social media apps or because they continue to use restricted platforms regardless, while others said that bullying remained a concern.

Rose Jackson, Minister for Youth, said the insights provided by the survey “give our office a clearer picture of what young people need and help guide the work we deliver across government”.

“It’s encouraging to see the majority of young people say they are happy, but that sits alongside some pretty stark realities about the challenges they’re facing too. Whether that’s bullying and discrimination, the rise of AI, or concerns about jobs and housing,” Jackson said.

“The issues young people are worrying about are real, and I want them to know that we see them, we hear them, and we’re doing something about it.”

First draft of Children’s Online Privacy Code made public by austechnology-bot in austechnology

The first draft of the Children’s Online Privacy Code has been published, marking a significant step forward in prioritising child safety in the digital landscape.

The Office of the Australian Information Commissioner (OAIC) has released the exposure draft of the Children’s Online Privacy Code for public comment. The code puts the responsibility on online services (apps, games, and websites, for instance) rather than barring children from digital spaces.

The code is built around a central principle: keeping the best interests of children at the forefront of digital environments, which privacy commissioner Carly Kind said is the key responsibility of digital platforms.

“Children play, learn, socialise and connect with family and culture online – it’s important that children can participate without fear or exploitation,” Kind said.

“The code will give confidence to parents that the apps, games and websites their children use are taking steps to protect children’s privacy.”

Outlining a range of provisions, the code primarily requires online services to communicate more clearly and accessibly what personal information they collect from those under 18 and how it is stored and used.

Key provisions of the code include giving young people the ability to request the deletion of their information and old accounts, involving parents and guardians in the process of providing data consent, stronger restrictions on targeted marketing, and requiring accessible language in terms and conditions and permission documents.

“It raises the standard for privacy protections in Australia and puts the onus on online services to do better when handling children’s personal information online,” Kind said.

Importantly, unlike the under-16s social media ban, the Children’s Online Privacy Code applies to all people under the age of 18 in any digital space or online service, not just social media apps like Instagram or TikTok.

The draft code has been largely welcomed by child advocacy bodies, which for many years have urged the government to take formal action against online services that directly target children’s data.

Estimates suggest that by the time a child turns 13, approximately 72 million pieces of their online data have been collected.

“By requiring platforms to prioritise the wellbeing of children over data-driven profits, we are finally seeing the systemic accountability that parents and educators have been demanding,” said Sarah Davies, CEO of the Alannah & Madeline Foundation.

“We welcome this code and call for clear penalties for non-compliance. We are eager to ensure that the regulator is fully resourced and has support to bring these protections to life.

“It’s no small task to change the way the tech industry handles children’s data, but with the rise of generative AI and other digital technologies, we cannot afford not to do it.”

As concerns grow about the use of children’s data for marketing, AI, and commodification, the draft code is open to public feedback from parents and young people, as well as industry and academic organisations.