Sunita Bose, managing director of Digital Industry Group Inc., an advocacy body for the digital industry in Australia whose members include X, Instagram, Facebook and TikTok, was answering questions at a single-day Senate committee hearing into world-first legislation introduced in Parliament last week.
Bose said the Parliament should wait until the government-commissioned evaluation of age assurance technologies is completed in June.
“Parliament is asked to pass a bill this week without knowing how it will work,” Bose said.
The legislation would impose fines of up to 50 million Australian dollars ($33 million) on platforms for systemic failures to prevent young children from holding accounts.
It seems likely to be passed by Parliament by Thursday with the support of the major parties.
It would take effect a year after the bill becomes law, allowing the platforms time to work out technological solutions that would also protect users' privacy.
Communications Minister Michelle Rowland said she looked forward to reading the Senate committee's assessment of the proposed law, which "supports parents to say 'no'" to children wanting to use social media.
“Social media in its current form is not a safe product for them,” Rowland told Parliament.
“Access to social media does not have to be the defining feature of growing up. There is more to life than constant notifications, endless scrolling and pressure to conform to the false and unrealistic perfectionism that can be served up by influencers,” she added.
Bose faced heated questioning from several senators, who also challenged the accuracy of her answers.
Opposition Sen. Ross Cadell asked how his 10-year-old stepson had been able to hold Instagram, Snapchat and YouTube accounts since the age of 8, despite the platforms' nominal age limit of 13.
Bose replied that “this is an area where the industry needs to improve.”
She said the proposed social media ban risked isolating some children and driving children to “darker, less safe online spaces” than mainstream platforms.
Bose said her concern with the proposed law was that “this could compromise the safety of young people,” prompting a hostile response from opposition Sen. Sarah Henderson.
“That’s an outrageous statement. You’re trying to protect the big tech giants,” Henderson said.
Unaligned Sen. Jacqui Lambie asked why the platforms didn’t use their algorithms to prevent harmful material being directed to children. The algorithms have been accused of keeping technology-addicted children connected to platforms and of flooding users with harmful material that promotes suicide and eating disorders.
“Your platforms have the ability to do that. The only thing that’s stopping them is themselves and their greed,” Lambie said.
Bose said algorithms were already in place to protect young people online through functions including filtering out nudity.
“We need to see continued investment in algorithms and ensuring that they do a better job at addressing harmful content,” Bose said.
Questioned by opposition Sen. Dave Sharma, Bose said she didn’t know how much advertising revenue the platforms she represented made from Australian children.
She said she was not familiar with research by the Harvard T.H. Chan School of Public Health that found X, Facebook, Instagram, TikTok, YouTube and Snapchat earned $11 billion in advertising revenue from U.S. users under 18 in 2022.
Communications department official Sarah Vandenbroek told the committee the evaluation of age assurance technologies that will report in June would assess not only their accuracy but also their security and privacy settings.
Department Deputy Secretary James Chisholm said officials had consulted widely before proposing the age limit.
“We think it’s a good idea and it can be done,” Chisholm told the committee.