Jan. 26, 2026

Brett Lee


In this conversation, Brett Lee, a former police officer and current internet safety educator, shares his journey from law enforcement to educating children and parents about online safety. He emphasizes the importance of education in preventing child exploitation and discusses effective strategies for communicating risks to children. Brett highlights the need for schools to implement strong policies regarding technology use and the role of parents in guiding their children through the online world. He also addresses recent government regulations on social media and their potential impact on child safety.


Guest

ISE (Internet Safe Education) creator, Brett Lee, worked as a Queensland Police Officer for 22 years, 16 of those as a Detective predominantly in the field of Child Exploitation. In his last five years of service, he was a specialist in the field of undercover internet child exploitation investigations. Brett has been personally involved in the online investigation, arrest and prosecution of numerous offenders, whose medium for preying on children is the internet. Brett has delivered training to members of law enforcement agencies including New South Wales Police, South Australian Police, West Australian Police and Australian Customs. Within his own police service he was involved in the training of plain clothes police through Detective Training and Sexual Crimes courses in Online Child Exploitation investigations. Further to Brett's experience, he has completed the FBI Advanced Internet Investigations Course and has worked with the FBI Innocent Images Unit, Maryland USA, the Department of Homeland Security Cyber Crimes Centre, Virginia USA and the San Jose Internet Crimes Against Children Task Force, California USA. He has also attended and spoken at conferences around Australia and the world.


Takeaways

  • Education is crucial in preventing child exploitation.
  • Children often lack awareness of online dangers.
  • Parents should trust their instincts regarding online interactions.
  • Effective communication is key to understanding children's online experiences.
  • Schools need to enforce clear technology policies.
  • Personal devices can pose significant risks in schools.
  • Parents should set rules and boundaries for technology use at home.
  • Engaging parents in education about online safety is essential.
  • Government regulations can help protect children online.
  • The online world is a hidden space that requires vigilance. 


Host

Kevin Fullbrook is an international school leader with 25+ years of global education experience across Australia, China, and the Middle East. As host of The Leadership Passport Podcast, Kevin dives into the stories, strategies, and insights of education leaders from around the world. With a passion for inclusive leadership, student agency, and sustainable school cultures, he brings thoughtful conversations and practical takeaways for educators, aspiring leaders, and anyone interested in the future of learning.

Connect with him on Instagram (@kevin.fullbrook) and LinkedIn (Kevin Fullbrook)

https://www.linkedin.com/in/kevin-fullbrook-33034b8b/

https://www.instagram.com/kevin.fullbrook/


1
00:00:05,080 --> 00:00:06,720
Welcome to the Leadership
Passport.

2
00:00:06,800 --> 00:00:09,600
I'm delighted to welcome my
guest today, Mr. Brett Lee.

3
00:00:10,040 --> 00:00:13,520
So Internet Safe Education
creator Brett Lee worked as a

4
00:00:13,520 --> 00:00:17,200
Queensland Police officer for 22
years, sixteen of those as a

5
00:00:17,200 --> 00:00:20,200
detective, predominantly in the
field of Child Exploitation.

6
00:00:20,880 --> 00:00:23,200
In his last five years of
service he was a specialist in

7
00:00:23,200 --> 00:00:25,960
the field of undercover Internet
Child Exploitation

8
00:00:25,960 --> 00:00:28,440
investigations.
Brett has been personally

9
00:00:28,440 --> 00:00:30,640
involved in the online
investigation, arrest and

10
00:00:30,640 --> 00:00:33,360
prosecution of numerous
offenders whose medium for

11
00:00:33,360 --> 00:00:35,040
preying on children is the
Internet.

12
00:00:35,760 --> 00:00:38,160
Brett has delivered training to
members of law enforcement

13
00:00:38,160 --> 00:00:42,200
agencies around Australia
including New South Wales, South

14
00:00:42,200 --> 00:00:45,240
Australian and West Australian
Police, and Australian Customs.

15
00:00:45,880 --> 00:00:48,560
Within his own police service,
he was involved in the training

16
00:00:48,560 --> 00:00:51,440
of plainclothes police through
detective training and sexual

17
00:00:51,440 --> 00:00:56,040
crimes courses in online Child
Exploitation investigations.

18
00:00:56,920 --> 00:00:59,680
Further to Brett's experience,
he's completed the FBI Advanced

19
00:00:59,680 --> 00:01:02,680
Internet Investigations course
and has worked with the FBI

20
00:01:02,840 --> 00:01:07,560
Innocent Images Unit in
Maryland, USA, the Department of

21
00:01:07,560 --> 00:01:11,280
Homeland Security Cyber Crimes
Center in Virginia, and the San

22
00:01:11,280 --> 00:01:13,520
Jose Internet Crimes Against
Children Task Force in

23
00:01:13,520 --> 00:01:16,120
California.
He's also attended and spoken at

24
00:01:16,120 --> 00:01:18,560
conferences around Australia and
the world.

25
00:01:18,640 --> 00:01:20,320
Welcome, Brett.
Great to have you here today.

26
00:01:21,000 --> 00:01:22,480
Thank you, Kevin.
It's great to be here.

27
00:01:23,680 --> 00:01:26,600
So before we talk about and sort
of get into the details of

28
00:01:26,600 --> 00:01:29,600
Internet safety and education,
can you briefly walk us through

29
00:01:29,600 --> 00:01:33,640
your journey from policing into
the work that you do today?

30
00:01:35,000 --> 00:01:37,960
Yeah, of course, when I started
policing, it was a very, very

31
00:01:37,960 --> 00:01:40,480
different world.
And technology evolved during my

32
00:01:40,480 --> 00:01:43,880
policing career and probably the
last five years that I was a

33
00:01:43,880 --> 00:01:46,440
police officer, I really got
involved heavily in it.

34
00:01:46,840 --> 00:01:49,800
As a Detective
Sergeant,

35
00:01:49,800 --> 00:01:54,320
I was part of an undercover
Internet team and our job was to

36
00:01:54,320 --> 00:01:57,600
assume the fictitious identities
of children, go onto the

37
00:01:57,600 --> 00:02:01,800
Internet as it was then, and
locate, identify and arrest

38
00:02:01,800 --> 00:02:04,680
child sex offenders who
have, of course, always been a

39
00:02:04,680 --> 00:02:08,560
part of our community.
But we're now utilizing the, you

40
00:02:08,560 --> 00:02:10,880
know, the capabilities of the
Internet to connect with and

41
00:02:10,880 --> 00:02:13,960
offend against children.
So in doing that job, it got me

42
00:02:13,960 --> 00:02:18,800
a very unique, you know, very
unique look at technology.

43
00:02:18,800 --> 00:02:21,320
Some of the things we don't see:
the world behind the screen.

44
00:02:21,320 --> 00:02:25,200
So I started working in that in
Brisbane, in Queensland,

45
00:02:25,200 --> 00:02:28,840
Australia, I started working as
an undercover detective and I

46
00:02:28,840 --> 00:02:32,160
did that for five years
until I left.

47
00:02:32,160 --> 00:02:36,520
So like I said, it gave me a
very unique insight not only to

48
00:02:36,920 --> 00:02:40,440
how technology works, but the
human aspect to it as well.

49
00:02:41,880 --> 00:02:46,080
It sounds like incredible must
be an incredibly difficult work

50
00:02:46,080 --> 00:02:48,680
and challenging work.
What kind of initially drew you

51
00:02:48,680 --> 00:02:51,960
into that?
Look, it was an interest in

52
00:02:51,960 --> 00:02:56,440
technology that I was aware of
the unit that had just started

53
00:02:56,440 --> 00:02:59,520
to do it.
So I thought this is something

54
00:02:59,520 --> 00:03:02,960
that, you know, I had that
passion for protecting children

55
00:03:02,960 --> 00:03:06,280
because I came from the Child
and Sexual Assault Unit,

56
00:03:06,280 --> 00:03:10,120
Brisbane Police Headquarters.
So I sort of wanted to expand

57
00:03:10,120 --> 00:03:12,800
on that a bit further.
And of course, like a lot of

58
00:03:12,800 --> 00:03:17,760
people, it's the curiosity, you
know, of that online world, its

59
00:03:17,760 --> 00:03:21,600
capabilities, its limitations.
And I noticed a lot of people

60
00:03:21,600 --> 00:03:26,040
around me had really no idea
what we were doing or, or what

61
00:03:26,040 --> 00:03:28,840
it really meant.
And during the podcast, I'll

62
00:03:28,840 --> 00:03:31,840
probably outline some of that.
Yeah.

63
00:03:31,840 --> 00:03:35,280
So I just applied and I got
selected and I was part of that

64
00:03:35,280 --> 00:03:37,760
unit.
And of course we did extensive

65
00:03:37,760 --> 00:03:40,040
training in relation to it: how to

66
00:03:40,040 --> 00:03:42,720
collect, store and put evidence
in a format where it's

67
00:03:42,720 --> 00:03:45,520
admissible in court.
Yeah, it was sort of.

68
00:03:45,520 --> 00:03:47,400
I felt it was like a natural
progression.

69
00:03:47,720 --> 00:03:51,520
And was there a particular
moment, you know, when you were

70
00:03:51,520 --> 00:03:54,720
coming to perhaps the end of
your service as a detective

71
00:03:54,720 --> 00:03:58,760
there when you realised that
education, rather than

72
00:03:58,760 --> 00:04:02,400
the enforcement and investigation
side, could be a really

73
00:04:02,400 --> 00:04:06,360
powerful way of, you know,
protecting children, or helping children

74
00:04:06,360 --> 00:04:09,520
to protect themselves.
That's a really good question

75
00:04:09,520 --> 00:04:12,560
because that is just spot on.
That's exactly what happened.

76
00:04:12,560 --> 00:04:16,279
So, you know, we were only
limited by time; we were

77
00:04:16,279 --> 00:04:19,240
arresting child sex offenders
from the Internet every single

78
00:04:19,240 --> 00:04:23,640
day of the week.
And we sort of realized

79
00:04:23,640 --> 00:04:25,720
that we can't arrest our way out
of this problem.

80
00:04:26,120 --> 00:04:32,680
But what was really the trigger
for me was that our officer in charge

81
00:04:32,680 --> 00:04:36,480
requested we go to a school and
deliver a cyber safety

82
00:04:36,480 --> 00:04:40,880
presentation, which I'd never
done before to a group of year

83
00:04:40,880 --> 00:04:44,640
10 female students at an all
girls school.

84
00:04:45,240 --> 00:04:48,920
And we just went to the school
and we sort of outlined, in,

85
00:04:49,000 --> 00:04:52,720
you know, very accessible
and acceptable terms

86
00:04:52,720 --> 00:04:57,080
to year 10 girls who are around
15 years of age, what our job

87
00:04:57,080 --> 00:04:58,720
entailed and what we were
witnessing.

88
00:04:58,720 --> 00:05:02,240
And the level of, and, and I
don't say this in a derogatory

89
00:05:02,240 --> 00:05:06,800
way, the level of ignorance, not
only to how

90
00:05:06,800 --> 00:05:12,440
technology works, but to the
lack of life skills.

91
00:05:12,440 --> 00:05:16,960
And I can remember one 15 year old
girl said to me, do people

92
00:05:16,960 --> 00:05:20,640
really want to do that to other
people, to 15 year old girls?

93
00:05:22,200 --> 00:05:24,440
That sort of blew me away.
And I thought, well, I just

94
00:05:24,440 --> 00:05:26,600
assumed you knew there were
people out there who were going

95
00:05:26,600 --> 00:05:29,320
to harm you in a number of
different ways.

96
00:05:29,320 --> 00:05:33,360
And that's when I realized how
vulnerable children can be, not

97
00:05:33,360 --> 00:05:35,840
because they can't use
technology, but because they have a

98
00:05:35,840 --> 00:05:39,320
different view of the world, and
they didn't have our life

99
00:05:39,320 --> 00:05:41,680
skills.
So, you know, very accepting and

100
00:05:41,680 --> 00:05:43,760
very trusting.
And I thought, well, this, this

101
00:05:43,760 --> 00:05:46,760
is the key.
If we can remove potential

102
00:05:46,760 --> 00:05:50,560
victims, we can make them
smarter and remove the victims.

103
00:05:50,560 --> 00:05:53,800
Even though there may be
predators online, there's no way

104
00:05:53,800 --> 00:05:57,400
they can accomplish their, you
know, their illegal, you know,

105
00:05:57,400 --> 00:05:59,000
the things they want to do that are
illegal.

106
00:05:59,360 --> 00:06:02,760
So that really, really gave me
the idea that, you know,

107
00:06:03,120 --> 00:06:07,240
education is the key,
enforcement is very important

108
00:06:07,680 --> 00:06:11,160
and it is effective
in a lot of ways

109
00:06:11,160 --> 00:06:13,160
where we need to remove these
people, they need to be

110
00:06:13,160 --> 00:06:15,000
accountable, they need to be
identified.

111
00:06:15,560 --> 00:06:19,160
But to make that real community
change, it's really the

112
00:06:19,160 --> 00:06:23,840
education.
So, you know, talking about like

113
00:06:23,840 --> 00:06:29,240
that, that first, you know,
presentation and education

114
00:06:29,240 --> 00:06:31,480
session for the kids and in
your subsequent ones.

115
00:06:31,840 --> 00:06:34,040
Talk us through a little bit the
kinds of things that you cover.

116
00:06:34,040 --> 00:06:37,120
What do you and I've been lucky
enough we were just chatting

117
00:06:37,120 --> 00:06:41,080
before to have you in one of
the schools I was in and talked

118
00:06:41,080 --> 00:06:44,640
to our kids and it was
incredibly powerful and eye

119
00:06:44,640 --> 00:06:46,520
opening for those kids.
Could you run us through a

120
00:06:46,520 --> 00:06:48,960
little bit about kind of what
you're covering and what you

121
00:06:48,960 --> 00:06:51,080
share with them?
Because it does shock a lot of

122
00:06:51,080 --> 00:06:56,480
the kids, not in an
explicit way or an over the

123
00:06:56,480 --> 00:07:00,680
top way, but it really shocks
them in terms of the kinds of

124
00:07:00,680 --> 00:07:02,960
things that are
possible and that other people

125
00:07:02,960 --> 00:07:05,520
want to do like you said.
Yeah, look, exactly.

126
00:07:06,120 --> 00:07:08,720
Messages should never be
designed to create fear.

127
00:07:09,080 --> 00:07:12,640
But I think it's important that
everyone in our community knows

128
00:07:12,640 --> 00:07:15,000
the world they're in.
They know the truth because

129
00:07:15,000 --> 00:07:17,920
that's what gives them the
passion and reason to know what

130
00:07:17,920 --> 00:07:20,240
role they need to play.
Whether it's a child protecting

131
00:07:20,240 --> 00:07:23,240
themselves, whether it's a
teacher, an educator or a parent

132
00:07:23,480 --> 00:07:26,840
playing a role to protect
children, people really need to

133
00:07:26,840 --> 00:07:30,840
know the dangers that exist.
And I think there are

134
00:07:30,840 --> 00:07:34,680
people, and there are
areas in our community, that I

135
00:07:34,680 --> 00:07:38,520
don't think give kids the credit
they deserve, who say, you know,

136
00:07:38,520 --> 00:07:41,880
let's shelter them these
days from the knowledge that

137
00:07:41,880 --> 00:07:45,520
these dangers exist.
I don't agree with that.

138
00:07:45,520 --> 00:07:48,760
I think they're quite resilient.
I think when young people know

139
00:07:48,760 --> 00:07:51,560
there's a reason why these
things are being put in place,

140
00:07:52,880 --> 00:07:55,240
you know, that's why
they're being put in place.

141
00:07:55,240 --> 00:07:58,720
So I think that's important
that we

142
00:07:58,920 --> 00:08:02,880
highlight, you know, the dangers
that exist, not to say any

143
00:08:03,240 --> 00:08:05,800
particular people or not to say
the internet's bad, but these

144
00:08:05,800 --> 00:08:08,400
dangers are real because it's
the truth.

145
00:08:08,400 --> 00:08:12,400
So how I would deliver my
messages to children was first

146
00:08:12,400 --> 00:08:15,240
to what you said, in an
acceptable, accessible way,

147
00:08:17,280 --> 00:08:19,120
depending on what age group it
is.

148
00:08:19,120 --> 00:08:22,280
You know, we deliver to primary
schools, to kids right down in

149
00:08:22,280 --> 00:08:27,480
year 1/2, right up to year 12.
But yeah, we don't over

150
00:08:27,480 --> 00:08:30,200
sensationalize.
We just let them know, you know,

151
00:08:30,280 --> 00:08:32,640
our experience and the dangers
that do exist.

152
00:08:32,640 --> 00:08:36,840
And we just go through the
strategies for how to avoid those

153
00:08:36,840 --> 00:08:40,400
dangers while still being a part
of the online world, letting

154
00:08:40,400 --> 00:08:41,919
them know it's not going to be
perfect.

155
00:08:41,919 --> 00:08:45,760
You could have issues, but it's
the skills to fix those issues

156
00:08:45,760 --> 00:08:47,840
and move on.
This is what's open to you to

157
00:08:47,840 --> 00:08:50,200
make that world safe.
These are the things you should

158
00:08:50,200 --> 00:08:52,960
avoid.
So, you know, letting them know

159
00:08:52,960 --> 00:08:55,440
the dangers that exist, how to
avoid those dangers.

160
00:08:55,440 --> 00:08:58,600
And if they have problems,
having the skills and sometimes

161
00:08:58,600 --> 00:09:02,680
the courage to take action to
fix those problems.

162
00:09:02,960 --> 00:09:05,280
And I really think it's as
simple as that because I say to

163
00:09:05,280 --> 00:09:07,040
parents, what are we trying to
achieve here?

164
00:09:07,040 --> 00:09:10,880
Not kick kids off the Internet,
but for them to be able to, you

165
00:09:10,880 --> 00:09:14,720
know, experience all the great
stuff whilst reducing risk.

166
00:09:15,120 --> 00:09:17,480
And I don't think we need to
overcomplicate it.

167
00:09:17,480 --> 00:09:20,080
And particularly for young
people, they need something that

168
00:09:20,080 --> 00:09:23,240
you know they can grasp onto,
that they can action, they can

169
00:09:23,240 --> 00:09:25,560
use.
I remember from some of your

170
00:09:25,560 --> 00:09:29,600
workshops that you give you
share some specific examples of

171
00:09:29,600 --> 00:09:34,600
what of kind of how kids can
find themselves getting into

172
00:09:34,600 --> 00:09:36,520
strife and getting into some
trouble.

173
00:09:37,960 --> 00:09:42,160
And before they know it, they're
kind of in, in, you know, pretty

174
00:09:42,160 --> 00:09:44,880
deep into things.
And it can be hard for

175
00:09:44,880 --> 00:09:46,720
either kids or parents or
teachers who aren't, maybe

176
00:09:46,720 --> 00:09:49,520
aren't familiar with technology
and social media and some of the

177
00:09:49,520 --> 00:09:52,480
platforms that are out there to
know what that kind of

178
00:09:52,480 --> 00:09:55,280
looks like and, and to recognize
maybe some of the signals or

179
00:09:55,280 --> 00:09:59,760
signs that, you know, they're
getting themselves into a

180
00:09:59,760 --> 00:10:02,600
difficult position with someone
maybe who's not, who's

181
00:10:02,600 --> 00:10:04,600
representing themselves as
someone who they're not.

182
00:10:04,600 --> 00:10:07,120
I'm wondering if maybe you could
share one or two examples from

183
00:10:07,120 --> 00:10:11,240
your experience of how that
kind of situational process

184
00:10:11,240 --> 00:10:15,080
unfolds.
Yeah, look, because I live

185
00:10:15,080 --> 00:10:18,080
this every day and I think
about it constantly.

186
00:10:18,840 --> 00:10:21,720
I really don't think the
human brain has been set up

187
00:10:22,400 --> 00:10:29,000
evolutionary wise or creation
wise to handle the nature of the

188
00:10:29,000 --> 00:10:32,960
online world in that the
Internet exists in our mind.

189
00:10:33,080 --> 00:10:37,720
So it's very hard. For us,
seeing is believing, and

190
00:10:37,720 --> 00:10:40,840
our mind tries to create scenarios
based on the information it

191
00:10:40,840 --> 00:10:43,040
receives.
And quite often, as you

192
00:10:43,040 --> 00:10:46,400
mentioned, that information may
not be accurate when we're using

193
00:10:46,480 --> 00:10:48,280
the Internet.
So it's quite hard to

194
00:10:48,280 --> 00:10:51,400
contextualize what's really
happening with technology.

195
00:10:51,760 --> 00:10:53,800
And there are people that will
take advantage of that.

196
00:10:55,320 --> 00:10:59,800
I've arrested hundreds of
adults, generally males.

197
00:10:59,800 --> 00:11:03,320
They're nearly always males who
are using technology to target

198
00:11:03,320 --> 00:11:05,400
children.
And my job was to pretend to be

199
00:11:05,400 --> 00:11:09,200
a child.
Every single person I arrested

200
00:11:09,680 --> 00:11:12,280
believed that I was a
child.

201
00:11:12,400 --> 00:11:16,920
Beyond all that, I was a child
to them who was as real to them

202
00:11:16,920 --> 00:11:18,640
as someone walking around in the
street.

203
00:11:19,000 --> 00:11:21,960
How I achieved that was I told
them I was a child.

204
00:11:22,640 --> 00:11:26,360
I sent fake pictures of a child.
I told them what they wanted to

205
00:11:26,360 --> 00:11:29,640
hear, so they've been more
inclined to believe me and

206
00:11:29,640 --> 00:11:35,800
communicate with me.
And I spoke like a child.

207
00:11:36,200 --> 00:11:39,360
And that was enough for them to
believe that they knew who they

208
00:11:39,360 --> 00:11:42,040
were talking to.
Every single one of them, some

209
00:11:42,040 --> 00:11:45,400
of them, when we arrested

210
00:11:45,400 --> 00:11:49,320
them, were struggling to
comprehend that that child

211
00:11:49,320 --> 00:11:52,240
didn't exist.
In their mind, that child was

212
00:11:52,240 --> 00:11:54,640
real.
Now, this is an adult person.

213
00:11:55,200 --> 00:11:59,240
Now, if we can transpose that to
young people, imagine the

214
00:11:59,240 --> 00:12:02,520
challenges that they're going to
have when looking at a screen,

215
00:12:02,520 --> 00:12:07,560
processing that information with
the developing brain, with life

216
00:12:07,560 --> 00:12:09,360
skills that aren't as developed
as us.

217
00:12:09,360 --> 00:12:12,600
Very trusting, accepting.
There's things they're going to

218
00:12:12,600 --> 00:12:18,120
really want out of life to
develop into who they believe

219
00:12:18,120 --> 00:12:20,920
they are.
They want to be accepted and as

220
00:12:21,000 --> 00:12:23,400
humans, we've all got needs and
vulnerabilities.

221
00:12:23,400 --> 00:12:26,160
And you know, some of the needs
and vulnerabilities I've

222
00:12:26,160 --> 00:12:28,600
identified that you and your
audience would know better than

223
00:12:28,600 --> 00:12:31,720
me, being educators: they
want to be accepted.

224
00:12:31,720 --> 00:12:33,680
They want to move on to the next
level.

225
00:12:33,960 --> 00:12:36,840
They want to feel that they're
someone, they want to be famous.

226
00:12:37,960 --> 00:12:43,640
So they become very vulnerable
in relation to, you know,

227
00:12:43,680 --> 00:12:48,080
interacting with people online.
So really the only way

228
00:12:48,120 --> 00:12:50,560
we can target this, if children
are going to be part of an

229
00:12:50,560 --> 00:12:55,360
online world, is to pass on, you
know, generic messages of

230
00:12:55,360 --> 00:12:57,680
substance.
It's to get young people to

231
00:12:57,680 --> 00:13:00,560
think.
I say to young people, the

232
00:13:00,560 --> 00:13:02,160
Internet only exists in one
place.

233
00:13:02,160 --> 00:13:04,600
It's in someone's mind.
And when someone makes a choice

234
00:13:04,600 --> 00:13:06,440
online, it's not because they
slipped.

235
00:13:07,840 --> 00:13:12,080
That bad choice didn't
start in their fingertips; that

236
00:13:12,080 --> 00:13:14,760
choice didn't
start on the screen.

237
00:13:14,760 --> 00:13:18,280
It started here.
So that's what we have to work

238
00:13:18,280 --> 00:13:20,960
on.
Nobody online, no matter who

239
00:13:20,960 --> 00:13:24,840
they are, no matter how bad, no
matter how big, can make us make

240
00:13:25,040 --> 00:13:27,760
a choice.
That's always going to be us

241
00:13:28,160 --> 00:13:30,280
tapping the key.
So we want our kids to think the

242
00:13:30,280 --> 00:13:33,680
right way so they know there's
choices I do and don't make.

243
00:13:34,240 --> 00:13:37,840
But that example of, you know,
those criminals, they just

244
00:13:37,840 --> 00:13:39,920
accepted whatever I put on the
screen.

245
00:13:40,120 --> 00:13:43,200
And these were adult people and
you would assume they knew they

246
00:13:43,200 --> 00:13:45,800
were doing the wrong thing.
So you would assume they'd be on

247
00:13:45,800 --> 00:13:49,600
the lookout for, you know,
maybe a police officer doing

248
00:13:49,600 --> 00:13:51,680
this or someone not being who
they are.

249
00:13:52,000 --> 00:13:55,040
But none of them questioned it.
They just accepted it.

250
00:13:55,560 --> 00:13:59,840
So if we can, you know, really
target, you know, the mindset of

251
00:13:59,840 --> 00:14:02,840
our children with some
basic generic messages, I

252
00:14:02,840 --> 00:14:05,680
think that's what's going to
empower them to be able to use

253
00:14:05,680 --> 00:14:07,280
the Internet.
Because like I said to parents,

254
00:14:07,280 --> 00:14:09,000
you can't be there 24 hours a
day.

255
00:14:09,760 --> 00:14:13,120
Your role and a role as an
educator is to give kids

256
00:14:13,120 --> 00:14:16,680
everything they need to put them
in a position where they can

257
00:14:16,680 --> 00:14:21,360
make the right choice themselves.
So what do you think?

258
00:14:21,360 --> 00:14:23,960
From your experience, what
do schools most commonly

259
00:14:23,960 --> 00:14:28,920
underestimate about online risk?
Look, I think in a large

260
00:14:28,920 --> 00:14:31,480
regard, one of my messages is
that the internet's public and

261
00:14:31,480 --> 00:14:35,600
we know that, but in a large
regard it's a very hidden

262
00:14:35,600 --> 00:14:38,240
world.
So each person, because it's up

263
00:14:38,240 --> 00:14:42,080
here, has their own experience.
And I think when, when adults

264
00:14:42,080 --> 00:14:45,920
particularly, you know, law
abiding, responsible adults who

265
00:14:45,920 --> 00:14:48,600
have the kids' best interests at
heart, like parents and carers

266
00:14:48,600 --> 00:14:53,360
and teachers, we assume that the
experience we have is similar to

267
00:14:53,360 --> 00:14:55,320
the experience our children
have.

268
00:14:56,040 --> 00:14:59,440
So I've had parents who have
said to me, oh, you know, I've

269
00:14:59,440 --> 00:15:01,480
had a good time on Facebook.
My child's going to have a good

270
00:15:01,480 --> 00:15:04,320
time on Facebook.
That's absolutely not the truth.

271
00:15:04,760 --> 00:15:06,720
Because they're at a different
stage of their life.

272
00:15:06,720 --> 00:15:08,480
They're looking to get different
things from it.

273
00:15:08,480 --> 00:15:10,200
They're going to connect with
different people.

274
00:15:10,200 --> 00:15:12,520
They're not thinking the same.
They're not, they're they're for

275
00:15:12,520 --> 00:15:15,880
the same purposes.
So I think we can

276
00:15:15,880 --> 00:15:20,400
underestimate some of the
effects that the screen, that

277
00:15:20,400 --> 00:15:23,400
technology, that interaction can
have on a young person, whether

278
00:15:23,480 --> 00:15:26,440
it's predatory behaviour, whether
it's cyber bullying.

279
00:15:26,680 --> 00:15:30,160
I have so many parents say to
me, but my kids were cyber

280
00:15:30,160 --> 00:15:33,520
bullied, they just delete it,
they just block that person,

281
00:15:33,520 --> 00:15:36,360
They'd move away, which is
absolutely not the case.

282
00:15:36,760 --> 00:15:40,960
Now, I've spoken to some very
clever psychologists, child

283
00:15:40,960 --> 00:15:44,360
psychologists who say human
nature, particularly children,

284
00:15:44,360 --> 00:15:46,760
they can't do that.
They're drawn to that content.

285
00:15:47,040 --> 00:15:48,360
They want to see what's
happening.

286
00:15:48,360 --> 00:15:51,320
They think the more they look at
it, the more they're going to be

287
00:15:51,320 --> 00:15:53,960
able to put it into perspective.
But it just compounds the

288
00:15:53,960 --> 00:15:57,360
negative effects.
But then, as adults, we can

289
00:15:57,360 --> 00:16:00,680
self-manage better as well
because our brains are

290
00:16:00,680 --> 00:16:04,440
fully developed, and children can
be very susceptible to

291
00:16:04,440 --> 00:16:09,640
addiction, you know, to being
drawn to content that's outside

292
00:16:10,520 --> 00:16:13,640
what we could consider as
normal, common propriety in

293
00:16:13,640 --> 00:16:17,520
society, like language and how
to treat people, which has a

294
00:16:17,520 --> 00:16:19,520
bigger effect on a young
person's mind.

295
00:16:19,520 --> 00:16:24,640
So I'm never negative
when I talk to educators or

296
00:16:24,640 --> 00:16:27,320
parents.
See, we didn't grow up in that

297
00:16:27,320 --> 00:16:29,320
world as a young person being
exposed to it.

298
00:16:29,320 --> 00:16:31,960
So we really don't know the
effects that it can have.

299
00:16:31,960 --> 00:16:38,000
So I think it's also because we
grew up in a world where the

300
00:16:38,000 --> 00:16:43,400
dangers were generally physical.
Now they're hugely, potentially

301
00:16:43,520 --> 00:16:46,720
hugely psychological.
And I don't think we really

302
00:16:46,720 --> 00:16:49,680
understand the importance of,
you know, playing that role of

303
00:16:49,680 --> 00:16:53,040
educating the children, or
understanding some of the, you

304
00:16:53,040 --> 00:16:57,400
know, negatively life
changing issues that technology

305
00:16:57,680 --> 00:16:59,280
can potentially create.
Yeah.

306
00:16:59,280 --> 00:17:02,000
And you bring up a good point
there when, you know, we were,

307
00:17:02,000 --> 00:17:05,520
we were growing up, you know,
there was, you know, there

308
00:17:05,520 --> 00:17:08,200
wasn't the Internet.
It wasn't really around in

309
00:17:08,200 --> 00:17:11,839
the way that it is now.
And but you know, those kind of

310
00:17:12,319 --> 00:17:16,240
people who sought to do harm and
do the wrong thing, you know,

311
00:17:16,520 --> 00:17:19,720
that was more of a in a physical
space.

312
00:17:19,720 --> 00:17:21,720
I remember growing up in
Australia, there was a campaign

313
00:17:21,720 --> 00:17:24,560
around stranger danger and
around safe houses.

314
00:17:25,160 --> 00:17:27,800
You know, people would talk to
you about how to keep as a

315
00:17:27,800 --> 00:17:29,440
child, how to keep yourself
safe.

316
00:17:29,440 --> 00:17:32,560
The signs to look out for,
the things to be aware of

317
00:17:32,560 --> 00:17:36,560
and then what to do.
So what does

318
00:17:36,560 --> 00:17:39,760
that advice look like for kids
these days with technology, in

319
00:17:39,760 --> 00:17:42,920
terms of, you know, what are the
things to look out for?

320
00:17:42,920 --> 00:17:45,920
What are the signs?
And then what do you do?

321
00:17:45,920 --> 00:17:50,240
If if you do recognize something
that doesn't look or feel right,

322
00:17:50,440 --> 00:17:54,520
then what do you do?
Look, I think for a parent and

323
00:17:54,520 --> 00:17:58,560
and a teacher, it's important to
use the skills that you

324
00:17:58,560 --> 00:18:02,480
already possess and not to let
technology slip under the radar

325
00:18:02,480 --> 00:18:04,480
and make you believe that
because someone's behind a

326
00:18:04,480 --> 00:18:08,440
screen, something's changed.
So all the dangers that were

327
00:18:08,440 --> 00:18:11,800
presented to us in the physical
world and the systems that were

328
00:18:11,800 --> 00:18:14,360
put in place to protect us
there, we should be thinking the

329
00:18:14,360 --> 00:18:17,000
same online.
The dangers are exactly the

330
00:18:17,000 --> 00:18:19,880
same.
So you think, would it be OK for

331
00:18:19,880 --> 00:18:24,720
someone to walk up to my child,
an adult, and ask for their

332
00:18:24,720 --> 00:18:26,760
details down at the shopping
center?

333
00:18:27,360 --> 00:18:29,440
If the answer is, well, that
would be horrifying.

334
00:18:29,440 --> 00:18:31,360
Of course not, because that's
not normal.

335
00:18:31,600 --> 00:18:33,800
That's not OK.
They don't have that right.

336
00:18:33,920 --> 00:18:35,880
We should be thinking the same
online.

337
00:18:36,560 --> 00:18:40,800
Now we don't panic, but we want
to educate our children

338
00:18:40,800 --> 00:18:43,720
that it's not OK just because
someone's behind a screen.

339
00:18:44,320 --> 00:18:48,400
So I think we think the same. As
parents and as

340
00:18:48,400 --> 00:18:53,160
educators, we look very
carefully at online contacts, no

341
00:18:53,160 --> 00:18:56,960
matter who, because that's where
nearly all the problems exist.

342
00:18:57,280 --> 00:18:59,920
I mean, a child may see
something that's disturbing, but

343
00:18:59,920 --> 00:19:02,520
that's something that can be
managed and we can move past

344
00:19:02,520 --> 00:19:04,160
that.
But if they're connected to

345
00:19:04,160 --> 00:19:07,000
people online who for no reason
want to become part of their

346
00:19:07,000 --> 00:19:10,600
life, want their personal
information, are promising to

347
00:19:10,600 --> 00:19:14,360
give them things, I can
guarantee you having spent

348
00:19:14,400 --> 00:19:17,600
thousands of hours communicating
with online child sex offenders,

349
00:19:17,960 --> 00:19:20,760
that that is a person with ill
intention, because that's not

350
00:19:20,760 --> 00:19:22,800
normal.
That's not OK.

351
00:19:22,800 --> 00:19:27,440
There's no reason for that.
So again, we don't panic.

352
00:19:27,680 --> 00:19:30,520
Is that person who they say they
are and they're just interested?

353
00:19:30,880 --> 00:19:34,920
I don't know and I don't care
because they're not part of my

354
00:19:34,920 --> 00:19:37,800
child's life, so they don't need
to communicate with them.

355
00:19:38,440 --> 00:19:41,040
So looking for people who are
trying to get access, looking

356
00:19:41,040 --> 00:19:45,360
for people who are
trying to pique our children's

357
00:19:45,400 --> 00:19:48,400
interests because children are
very curious.

358
00:19:49,120 --> 00:19:55,520
People who are interacting in a
sexualized way online as well.

359
00:19:55,520 --> 00:19:58,800
Predators do that for a number

360
00:19:58,800 --> 00:20:00,560
of reasons.
One, they can identify a child

361
00:20:00,560 --> 00:20:04,120
who's prepared to take risk.
It desensitizes the child and it

362
00:20:04,120 --> 00:20:09,200
excites them, the predator.
So we don't just dismiss

363
00:20:09,200 --> 00:20:11,160
that.
We don't think, oh, this is just

364
00:20:11,160 --> 00:20:14,960
young people, you know, talking
in a way that young people talk.

365
00:20:14,960 --> 00:20:19,760
No, that's not OK.
And the other thing is looking

366
00:20:19,920 --> 00:20:23,480
at people who befriend their
children in public spaces,

367
00:20:24,040 --> 00:20:27,360
such as social media and gaming
sites and video sharing sites,

368
00:20:27,400 --> 00:20:32,920
and then want to move our child
from there to start interacting

369
00:20:33,200 --> 00:20:37,160
in a private messaging program.
So we want to remove private

370
00:20:37,160 --> 00:20:41,280
relationships and friendships.
It's OK for kids to play games.

371
00:20:41,280 --> 00:20:44,040
It's OK for them to
interact with

372
00:20:44,040 --> 00:20:46,640
people online as long as they
treat them for who they are.

373
00:20:46,640 --> 00:20:49,080
They're a stranger.
If that person walked up to me

374
00:20:49,080 --> 00:20:52,160
in the street and asked me where
I lived, that's not OK.

375
00:20:52,320 --> 00:20:55,520
Then it doesn't automatically
become OK when it's online.

376
00:20:55,880 --> 00:21:00,000
And what I generally say is a
lot of parents and teachers

377
00:21:00,000 --> 00:21:01,680
already have the skills they
need.

378
00:21:01,680 --> 00:21:03,600
It's about trusting those
instincts.

379
00:21:05,200 --> 00:21:07,600
I say to a parent, you may not
even know how to communicate

380
00:21:07,680 --> 00:21:10,960
online or turn a computer on, but if
something inside you tells you

381
00:21:10,960 --> 00:21:12,480
something's wrong, something is
wrong.

382
00:21:12,960 --> 00:21:15,520
And it generally starts with
conversation, of course.

383
00:21:15,520 --> 00:21:18,920
And I let parents and
teachers know you're not being a

384
00:21:18,920 --> 00:21:20,600
bad parent.
You're not saying you don't

385
00:21:20,600 --> 00:21:22,800
trust your kids.
You're saying you identify that

386
00:21:22,800 --> 00:21:25,760
you don't trust the nature of
the world we live in and we want

387
00:21:25,760 --> 00:21:32,000
our kids to feel safe.
So it's that personal

388
00:21:32,000 --> 00:21:36,120
interaction online that's one of
the big things to look out for.

389
00:21:37,840 --> 00:21:42,240
So you've been into hundreds,
probably thousands of schools

390
00:21:42,280 --> 00:21:44,360
over, you know, many, many
years.

391
00:21:44,840 --> 00:21:47,520
What are some things that you
see when you go into a school

392
00:21:47,560 --> 00:21:51,480
and you recognize this school
really gets it, in terms of

393
00:21:51,520 --> 00:21:54,360
the community or the people
in the school in terms of their

394
00:21:54,360 --> 00:21:58,160
approaches to education around
online safety and those sorts of

395
00:21:58,160 --> 00:21:59,200
things.
What are some of

396
00:21:59,400 --> 00:22:02,360
those key features where you
see, all right, well, this

397
00:22:02,360 --> 00:22:04,560
school, this community is pretty
switched on in terms of how

398
00:22:04,560 --> 00:22:07,840
they're approaching this stuff.
Look, things are changing and

399
00:22:08,000 --> 00:22:11,760
people sort of didn't know
how to manage it in a

400
00:22:11,760 --> 00:22:13,600
school.
I've been going into schools for

401
00:22:13,600 --> 00:22:15,720
17 years and I've seen a lot of
change.

402
00:22:16,000 --> 00:22:18,440
The first thing I notice
when I go into a school

403
00:22:18,440 --> 00:22:20,920
these days is this. I
say, how's it going at school

404
00:22:20,920 --> 00:22:24,320
with kids and technology?
The school says to me, look,

405
00:22:24,320 --> 00:22:28,120
we deal with so many issues,
personal issues, self esteem

406
00:22:28,120 --> 00:22:32,800
issues, anger issues,
interpersonal skills, consent

407
00:22:32,800 --> 00:22:35,240
issues with our students.
And I say, well, how's this

408
00:22:35,240 --> 00:22:37,920
happening?
Every single time?

409
00:22:37,920 --> 00:22:42,600
The answer is the same: social
media, every single time.

410
00:22:42,960 --> 00:22:46,200
They say it takes up 80% of
their time when it comes to

411
00:22:46,200 --> 00:22:49,800
pastoral care, when it comes to,
you know, looking after kids'

412
00:22:49,800 --> 00:22:52,680
well-being. Social media, that's
the first one.

413
00:22:53,000 --> 00:22:55,240
Now, where are they getting
access to this social media?

414
00:22:55,360 --> 00:22:57,880
Not on school devices that they
use to learn.

415
00:22:58,120 --> 00:23:00,480
It's on their own personal
devices.

416
00:23:01,120 --> 00:23:03,680
Like I said, times are changing.
When I was going

417
00:23:03,680 --> 00:23:07,280
into schools 17 years ago, not one
school in Australia had an

418
00:23:07,280 --> 00:23:10,200
acceptable use policy when it
came to technology.

419
00:23:10,920 --> 00:23:14,800
I think we assumed that if we
gave every teenager in the world

420
00:23:14,800 --> 00:23:18,520
an Internet-enabled device, they're
going to use it to learn, which

421
00:23:18,560 --> 00:23:23,000
they didn't.
So I go into a school, first

422
00:23:23,000 --> 00:23:27,720
thing I look for is a policy
that's ever-evolving.

423
00:23:28,120 --> 00:23:31,320
If they have an issue, the
points they can take

424
00:23:31,320 --> 00:23:33,680
from that are implemented into
the policy.

425
00:23:34,160 --> 00:23:36,400
Now what's a policy?
It is rules.

426
00:23:36,800 --> 00:23:40,880
And we have this in every single
aspect of our life.

427
00:23:41,240 --> 00:23:43,560
The government doesn't even
trust me to drive my car at the

428
00:23:43,560 --> 00:23:45,960
right speed.
They put a speed

429
00:23:45,960 --> 00:23:48,440
limit in place, it's got a
penalty attached to it.

430
00:23:49,080 --> 00:23:52,720
Policies are essential to let
people know where they stand.

431
00:23:53,240 --> 00:23:58,000
So that's the first thing.
The second thing I've noticed in

432
00:23:58,080 --> 00:24:02,160
the majority of schools I go to
now is that one of the rules in

433
00:24:02,160 --> 00:24:06,520
that policy is there is no
access to personal

434
00:24:06,600 --> 00:24:11,600
Internet-enabled devices in the school.
This is one of the biggest

435
00:24:11,600 --> 00:24:14,560
things that is going to reduce
risks, right?

436
00:24:14,560 --> 00:24:17,200
As an undercover detective, and

437
00:24:17,200 --> 00:24:20,920
not everyone agrees with this,
but I can tell you this, I

438
00:24:20,920 --> 00:24:25,280
never, ever understood this.
When I'd go into schools and I'd

439
00:24:25,280 --> 00:24:28,480
see kids walking around on their
own devices, I'd say, oh,

440
00:24:28,480 --> 00:24:30,600
they're allowed to do that?
Oh, yeah, well, they own it.

441
00:24:31,120 --> 00:24:33,080
It's their device.
And you know what I couldn't

442
00:24:33,080 --> 00:24:35,960
understand?
If someone walked up to the

443
00:24:35,960 --> 00:24:39,040
fence at school and called a
student over and said, hey,

444
00:24:39,080 --> 00:24:40,440
what's your name?
Where do you go to school?

445
00:24:40,440 --> 00:24:42,280
What's your address?
The school would probably call the

446
00:24:42,280 --> 00:24:44,800
police, because that's not OK.
That's concerning.

447
00:24:45,320 --> 00:24:47,800
But these students were walking
around at school with their own

448
00:24:47,800 --> 00:24:51,240
device and they could be being
groomed by a child sex offender

449
00:24:51,920 --> 00:24:55,760
on the school grounds under the
school's care without the school

450
00:24:55,760 --> 00:24:59,080
knowing. With those personal devices,
the biggest problem is there's

451
00:24:59,080 --> 00:25:02,000
no accountability.
It is a private world for them.

452
00:25:02,640 --> 00:25:05,720
The other thing I thought about:
you could have students at school

453
00:25:06,360 --> 00:25:09,640
bullying other students at
school and then that other

454
00:25:09,640 --> 00:25:13,200
student may be so upset they
take their own life or there's a

455
00:25:13,200 --> 00:25:16,080
fight.
I never understood why we as a

456
00:25:16,080 --> 00:25:20,240
community would allow this.
They've said, oh, you know, we

457
00:25:20,240 --> 00:25:22,080
trust the kids.
It's not about trusting our

458
00:25:22,080 --> 00:25:23,720
kids.
It's about trusting the world.

459
00:25:23,720 --> 00:25:27,480
It's about identifying that
sometimes young people will make

460
00:25:27,480 --> 00:25:30,200
mistakes.
Sometimes those mistakes with

461
00:25:30,200 --> 00:25:34,200
technology can be life-changing.
And why do we go to school?

462
00:25:34,480 --> 00:25:39,320
It's an educational environment.
Those phones are personal.

463
00:25:39,600 --> 00:25:42,040
That's what they're used for.
Kids generally don't use

464
00:25:42,040 --> 00:25:44,320
those phones to learn.
They use it to have fun.

465
00:25:44,560 --> 00:25:47,200
They're used to watch videos, play
games and connect with other

466
00:25:47,200 --> 00:25:51,040
people.
So the schools I see really

467
00:25:51,040 --> 00:25:55,000
starting to manage technology
are identifying this. And it's not

468
00:25:55,000 --> 00:25:57,080
easy.
No one likes change.

469
00:25:57,440 --> 00:26:01,080
No one likes to lose something.
And the first couple of years it

470
00:26:01,080 --> 00:26:04,960
can be tricky to say, OK,
there's no personal devices at

471
00:26:04,960 --> 00:26:06,720
school.
How a school chooses to manage

472
00:26:06,720 --> 00:26:09,840
that, I think, is going to be
up to them because every school

473
00:26:09,840 --> 00:26:11,400
is different.
Some schools manage a lot

474
00:26:11,400 --> 00:26:13,840
better.
Some students hand them in.

475
00:26:13,840 --> 00:26:15,560
Some schools say, OK, they've got

476
00:26:15,560 --> 00:26:17,240
to go in your lockers, you're
not allowed to touch them for

477
00:26:17,240 --> 00:26:20,200
the day.
And then I would have someone

478
00:26:20,200 --> 00:26:21,640
who'd say, oh, but they can go
and get it.

479
00:26:22,280 --> 00:26:24,440
Yeah.
So if they do, when they're

480
00:26:24,440 --> 00:26:26,640
caught, there's a penalty.
Their parents are called, the

481
00:26:26,640 --> 00:26:28,560
phone's taken, they get it back
at the end of the day.

482
00:26:28,720 --> 00:26:31,880
Second offence, the parent has
to come down to the school to

483
00:26:31,880 --> 00:26:34,360
collect the phone.
Third offence, internal

484
00:26:34,360 --> 00:26:37,280
suspension.
So this is how we manage the

485
00:26:37,280 --> 00:26:41,560
world because I say to parents,
when each child goes to bed, you

486
00:26:41,560 --> 00:26:44,000
can't stop them coming out,
taking the car keys and going and

487
00:26:44,000 --> 00:26:47,520
driving the car.
I said, you get that it's their

488
00:26:47,520 --> 00:26:50,400
responsibility.
Yeah, I can do that, but I don't

489
00:26:50,400 --> 00:26:52,800
do that.
And then when we can normalize

490
00:26:52,800 --> 00:26:54,800
that, and it might take a few
years because not everyone's

491
00:26:54,800 --> 00:26:57,160
going to be happy, particularly
if they've been able to use it,

492
00:26:58,880 --> 00:27:01,680
it becomes the norm.
And there'll always be people

493
00:27:01,680 --> 00:27:04,320
who do the wrong thing.
But it's about creating a

494
00:27:04,320 --> 00:27:08,040
positive norm whereby the
majority of their community

495
00:27:08,360 --> 00:27:11,760
complies.
So that's the other thing I've

496
00:27:11,760 --> 00:27:16,400
seen: where there's a policy, the
policy has to be enforced.

497
00:27:16,800 --> 00:27:18,920
There's only one thing worse
than no policy.

498
00:27:19,080 --> 00:27:20,960
It's having a policy and not
enforcing it.

499
00:27:21,560 --> 00:27:24,840
The next one is putting phones
and personal devices in their

500
00:27:24,840 --> 00:27:26,560
place.
You're here to learn.

501
00:27:26,760 --> 00:27:28,960
You're under our care.
If you have a problem, you come

502
00:27:28,960 --> 00:27:32,080
to us.
It's innumerable,

503
00:27:32,080 --> 00:27:34,480
the number of issues I've heard
that personal devices create.

504
00:27:34,480 --> 00:27:38,200
The big one that you as a school

505
00:27:38,240 --> 00:27:41,640
might have identified is that if
a child's having a problem at

506
00:27:41,640 --> 00:27:42,920
school, they don't go to a
teacher.

507
00:27:42,920 --> 00:27:45,720
They ring their parents. Then the
parent gets on to the school and

508
00:27:45,720 --> 00:27:47,040
says, what are you doing with
my child?

509
00:27:47,040 --> 00:27:49,840
This is happening to my child.
So it causes more issues.

510
00:27:50,200 --> 00:27:55,560
So getting those phones away,
trusting instincts, doing

511
00:27:55,560 --> 00:27:58,480
our own education in
school with them.

512
00:27:58,520 --> 00:28:01,400
There's so many great programs
out there now because I say to

513
00:28:01,400 --> 00:28:04,880
parents and schools, we don't
tell the kids the good stuff.

514
00:28:05,120 --> 00:28:08,760
No one, and certainly not TikTok,
is going to teach our kids how

515
00:28:08,760 --> 00:28:12,360
to manage technology.
So look, the other things I

516
00:28:12,360 --> 00:28:15,920
would have identified going into
schools, you know, having a

517
00:28:15,920 --> 00:28:20,800
learning device with management
software so the school knows

518
00:28:20,800 --> 00:28:23,240
what activity is happening, and
people know that someone's

519
00:28:23,240 --> 00:28:28,520
watching, having that policy,
enforcing the policy, keeping

520
00:28:28,520 --> 00:28:32,520
lines of communication open with
families and knowing that this

521
00:28:32,520 --> 00:28:35,600
is a team effort.
We're all stakeholders here.

522
00:28:37,080 --> 00:28:41,160
So, the education as well, and
restorative practices, of

523
00:28:41,160 --> 00:28:46,480
course, our staff identifying,
you know, that issues can be

524
00:28:46,480 --> 00:28:48,280
created with the young
person.

525
00:28:48,720 --> 00:28:53,440
And look, one of the big things
I had was, you know, most of

526
00:28:53,440 --> 00:28:56,680
these issues and I'm not sure
exactly how your community there

527
00:28:56,800 --> 00:28:59,600
works, but they're created at
home.

528
00:29:00,280 --> 00:29:03,800
So, you know, how much of a
should the school play?

529
00:29:04,240 --> 00:29:06,760
The thing is, the issues are
created at home, but generally

530
00:29:06,760 --> 00:29:08,840
they're brought into the school
and it affects, you know,

531
00:29:08,840 --> 00:29:11,240
the education and well-being
of the students involved.

532
00:29:11,240 --> 00:29:15,480
So again, each school needs to
manage it as best suits their

533
00:29:15,480 --> 00:29:17,640
community.
But if they sort of grasp onto

534
00:29:17,640 --> 00:29:23,440
those ideas, you know, I think
we're going a long way towards,

535
00:29:24,000 --> 00:29:26,440
you know, young people being
able to use it in and out of

536
00:29:26,440 --> 00:29:28,200
school.
You see, when they go into the

537
00:29:28,200 --> 00:29:30,440
workplace, you know what we can
teach them as a school?

538
00:29:30,920 --> 00:29:35,040
The internet's got rules.
I don't have the

539
00:29:35,040 --> 00:29:38,160
right to use my personal device
to check my social media while

540
00:29:38,160 --> 00:29:40,560
I'm at work.
We're teaching them these skills

541
00:29:40,800 --> 00:29:45,080
that if I have a problem, these
are the practices in place,

542
00:29:45,840 --> 00:29:48,240
these are the solutions in
place, and I do this and I fix

543
00:29:48,240 --> 00:29:51,760
my problem, never keep it to
myself, report it to a superior.

544
00:29:53,040 --> 00:29:55,920
So I think we're really
benefiting children by not

545
00:29:56,800 --> 00:29:59,160
getting them to think that when
I go into the workplace, I can

546
00:29:59,160 --> 00:30:03,200
do whatever I want on my phone.
So there's a chance they'll have to

547
00:30:03,200 --> 00:30:07,040
sign one of those documents when
they go into the workplace in

548
00:30:07,040 --> 00:30:09,760
relation to, I will comply with
the acceptable use policy.

549
00:30:09,880 --> 00:30:10,760
Yeah.
And you brought up an

550
00:30:10,760 --> 00:30:13,800
interesting point there just
before about partnerships at

551
00:30:13,800 --> 00:30:15,960
home.
And, you know, different

552
00:30:17,600 --> 00:30:19,960
jurisdictions and authorities
around the world as they relate

553
00:30:19,960 --> 00:30:23,480
to education will have different
viewpoints on this. And such

554
00:30:23,480 --> 00:30:26,640
a great point you made, that
stuff that starts outside of

555
00:30:26,640 --> 00:30:30,600
school is usually brought into
the school as those kids all

556
00:30:30,600 --> 00:30:34,800
come together.
And, you know, some

557
00:30:34,800 --> 00:30:37,640
jurisdictions will say the
school has limitations.

558
00:30:37,640 --> 00:30:41,480
It can't reach into the home and
address issues that have

559
00:30:42,040 --> 00:30:44,960
happened there, even though
they've caused massive issues in

560
00:30:44,960 --> 00:30:47,640
in the school.
So, you know, I think lots of

561
00:30:47,640 --> 00:30:49,840
schools are looking at, well,
how do we educate our parents?

562
00:30:49,840 --> 00:30:52,000
How do we get our parents on
board because we've got the kids

563
00:30:52,000 --> 00:30:54,200
for seven or eight hours a day.
The rest of the time they're at

564
00:30:54,200 --> 00:30:57,280
home or wherever else.

565
00:30:57,440 --> 00:31:00,120
What are some key things that
parent education looks like in

566
00:31:00,120 --> 00:31:03,360
this space so that everyone's
sort of pushing in the same

567
00:31:03,360 --> 00:31:06,040
direction?
You know what, that's a

568
00:31:06,040 --> 00:31:08,600
great point.
That's exactly what I hear from

569
00:31:08,600 --> 00:31:11,240
schools, whether I go anywhere
in Australia or New Zealand,

570
00:31:11,240 --> 00:31:14,120
or I've spoken in the States.
This isn't about geographical

571
00:31:14,120 --> 00:31:17,760
locations, it's about human nature.
Exactly the same

572
00:31:17,760 --> 00:31:19,920
conversations are being had all
around the world.

573
00:31:20,280 --> 00:31:23,480
Look, you're exactly right in
that once the child walks out of

574
00:31:23,480 --> 00:31:28,520
the school grounds, we don't
have the ability to manage what

575
00:31:28,520 --> 00:31:32,360
parents do.
So, like
you said, you have the kids for
576
00:31:32,360 --> 00:31:35,640
you said, you had the kids for
seven hours a day, so they're

577
00:31:35,640 --> 00:31:39,120
the captive audience, they're
the stakeholder that's captured

578
00:31:39,120 --> 00:31:44,240
and they get the education.
We can't do that with parents.

579
00:31:45,080 --> 00:31:49,000
Now, one of the, you know,
biggest things I try to

580
00:31:49,000 --> 00:31:52,800
achieve when I go into a school
community is to start that

581
00:31:52,800 --> 00:31:54,640
education with parents.
And we're not going to get

582
00:31:54,640 --> 00:31:56,720
everybody on board at the same
time.

583
00:31:57,120 --> 00:31:59,880
It's about drip-feeding it.
It's about creating a positive

584
00:31:59,880 --> 00:32:02,480
norm.
Parents start to do these things

585
00:32:02,480 --> 00:32:05,760
and they pass those messages on
and it may take a generation or

586
00:32:05,760 --> 00:32:08,800
two.
So the first thing I like to do

587
00:32:08,800 --> 00:32:11,720
is, is really empower parents to
know that their role is

588
00:32:11,720 --> 00:32:14,600
incredibly important.
Sometimes I step back because of

589
00:32:14,600 --> 00:32:17,200
fear.
Sometimes I step back because I

590
00:32:17,200 --> 00:32:18,920
assume they know it
all.

591
00:32:19,240 --> 00:32:21,120
So we want parents to feel
empowered.

592
00:32:22,240 --> 00:32:26,000
So we want to let them know
it's OK for them to manage

593
00:32:26,000 --> 00:32:28,760
technology in the house.
They don't have to know much

594
00:32:28,760 --> 00:32:31,440
about it, but they're the ones
that have to manage it.

595
00:32:33,000 --> 00:32:36,480
The second thing I'd probably
say is for a parent to set rules

596
00:32:36,480 --> 00:32:38,520
and boundaries.
And if a parent says, well, what

597
00:32:38,520 --> 00:32:42,200
rules and boundaries? What you
could do is give them the school

598
00:32:42,200 --> 00:32:45,680
policy and say, we'll start
here. See how we manage

599
00:32:45,680 --> 00:32:48,000
technology at school.
There's times you can and can't

600
00:32:48,000 --> 00:32:50,040
use technology.
There's programs you can and

601
00:32:50,040 --> 00:32:53,520
can't use.
You know, there's places you can

602
00:32:53,520 --> 00:32:57,560
and can't use technology.
You know, there's the accounts

603
00:32:57,560 --> 00:33:00,160
you can and can't have.
This is what we expect you to

604
00:33:00,160 --> 00:33:02,240
do.
If you have a problem, you don't

605
00:33:02,240 --> 00:33:07,280
hand personal information over.
So we have rules and boundaries

606
00:33:07,280 --> 00:33:09,200
at home.
And what I say to parents is I

607
00:33:09,200 --> 00:33:11,680
don't think there's one thing
you can put in place at home, as

608
00:33:11,680 --> 00:33:13,240
it relates to a rule or a
boundary, that,

609
00:33:13,760 --> 00:33:16,560
even if it doesn't work, is going

610
00:33:16,560 --> 00:33:19,240
to have a negative impact on the
outcome of their life.

611
00:33:19,800 --> 00:33:23,080
Be prepared to make mistakes.
Be prepared that things may not

612
00:33:23,080 --> 00:33:25,960
work.
Be prepared to change the rules

613
00:33:25,960 --> 00:33:28,640
and boundaries as your children
grow because they become

614
00:33:28,640 --> 00:33:33,120
different people.
You know, I've had, I've had

615
00:33:33,120 --> 00:33:37,640
parents who say to me, my child
is struggling online, they're

616
00:33:37,640 --> 00:33:42,920
experiencing severe bullying and
mental health issues and they've

617
00:33:42,920 --> 00:33:46,480
been told by people, you can't
take their device away.

618
00:33:47,080 --> 00:33:50,720
That it's the child's right to have
it, or that you should never take their

619
00:33:50,720 --> 00:33:54,480
device away.
My response to that is that is

620
00:33:54,480 --> 00:33:58,080
ridiculous.
We are putting technology and

621
00:33:58,080 --> 00:34:00,760
the access to technology above
the health and well-being of

622
00:34:00,760 --> 00:34:04,080
their children.
If a parent decides it's having

623
00:34:04,080 --> 00:34:08,199
a negative effect, remove access
totally.

624
00:34:08,320 --> 00:34:12,520
If you feel that's needed, work
on the problem to fix it.

625
00:34:12,880 --> 00:34:16,719
Make sure you're comfortable,
your kids are OK, then they're

626
00:34:16,719 --> 00:34:20,280
back on using technology again,
maybe with a few changes.

627
00:34:20,800 --> 00:34:24,400
But never feel that you can't
remove technology if you feel

628
00:34:24,400 --> 00:34:26,440
it's destroying your child or
your family.

629
00:34:27,800 --> 00:34:30,280
Technology is very good at
building a lot of credibility

630
00:34:30,280 --> 00:34:32,159
around itself, and that's
its job.

631
00:34:32,159 --> 00:34:35,520
So we use it by the rules and
the boundaries.

632
00:34:35,520 --> 00:34:40,120
The other thing is this, every
single school I go into uses

633
00:34:40,120 --> 00:34:45,480
some form of management software
to manage what people are doing

634
00:34:45,480 --> 00:34:48,719
online and to see what they're
doing. For parents,

635
00:34:48,719 --> 00:34:53,080
it's called parental controls,
empowering parents to know they

636
00:34:53,080 --> 00:34:56,320
have a right to do that, not to
keep their kids off technology,

637
00:34:56,320 --> 00:34:58,680
but to help them manage that
online world.

638
00:34:59,680 --> 00:35:01,520
You know, there's a lot of great
ones out there.

639
00:35:01,520 --> 00:35:04,280
I mean, if a family's got Apple
devices, you've got Apple Family

640
00:35:04,280 --> 00:35:08,600
Sharing, Google has now got some
great free parental management

641
00:35:09,040 --> 00:35:11,960
software.
And what it does, it can limit

642
00:35:11,960 --> 00:35:16,160
time, it can block them out of
particular websites, and can let

643
00:35:16,160 --> 00:35:18,760
parents know whether they're
searching up on the wrong sorts

644
00:35:18,760 --> 00:35:22,600
of websites.
So use parental controls.

645
00:35:22,920 --> 00:35:26,680
Now I say this one to parents,
stay current and they go stay

646
00:35:26,680 --> 00:35:28,400
current.
I've still got my Nokia flip

647
00:35:28,400 --> 00:35:31,120
phone from 2002.
What do you mean stay current?

648
00:35:31,480 --> 00:35:33,840
I say, well, it's not about
staying current with technology,

649
00:35:33,960 --> 00:35:38,000
just technology as it relates to
your kids: knowing those

650
00:35:38,000 --> 00:35:40,720
sorts of programs they're using
and websites they're visiting.

651
00:35:41,200 --> 00:35:42,960
Now they say, well, how do I
stay current?

652
00:35:43,040 --> 00:35:45,560
I say, number one, you talk to your kids,
that's how you're going to stay

653
00:35:45,560 --> 00:35:48,080
current.
I say you can do Google searches.

654
00:35:48,360 --> 00:35:51,000
You can talk to the school, who
has knowledge of, you know, what

655
00:35:51,000 --> 00:35:54,360
good and bad programs are.
And I

656
00:35:54,360 --> 00:35:56,840
say to parents, your son or
daughter comes home and they

657
00:35:56,840 --> 00:35:59,040
say, can I use this program and
you've never heard of it?

658
00:35:59,560 --> 00:36:03,080
Don't say yes because once it's
in, it's very hard to get out.

659
00:36:03,760 --> 00:36:06,080
Excuse yourself.
Do a Google search:

660
00:36:06,120 --> 00:36:09,200
"Everything a parent needs to
know about TikTok", and then it

661
00:36:09,200 --> 00:36:11,840
should bring up a lot of
reputable good information

662
00:36:11,840 --> 00:36:14,800
about, you know, how a parent
can manage TikTok or whether

663
00:36:14,800 --> 00:36:16,840
it's good for a child at a
particular age.

664
00:36:17,120 --> 00:36:20,960
But there is one more strategy
that I always pass on to

665
00:36:20,960 --> 00:36:24,040
parents.
It is create a culture of

666
00:36:24,040 --> 00:36:28,000
communication in your home.
Communication is king.

667
00:36:28,720 --> 00:36:31,280
In Australia we have what's
called the eSafety Commissioner.

668
00:36:31,280 --> 00:36:35,160
It's a federal government body
and their job is to manage

669
00:36:35,600 --> 00:36:38,920
technology.
They are the go-to when it comes

670
00:36:38,920 --> 00:36:42,160
to protecting people online.
Their number one message?

671
00:36:43,040 --> 00:36:46,480
Start the conversation.
This is where we identify

672
00:36:46,480 --> 00:36:49,200
whether a child's having a good
time, whether they're

673
00:36:49,200 --> 00:36:51,480
struggling, whether everything's
OK.

674
00:36:51,920 --> 00:36:54,480
It gives them a feeling of
support when they're speaking to

675
00:36:54,520 --> 00:36:57,720
a supportive adult because up
here can be the loneliest place

676
00:36:57,720 --> 00:37:00,160
they'll ever be.
It allows us to identify if

677
00:37:00,160 --> 00:37:02,800
there's a problem.
It allows us to take action to

678
00:37:02,800 --> 00:37:05,560
fix that problem.
Never underestimate the value

679
00:37:05,600 --> 00:37:09,320
and the power of communication.
I've lost count of the number of

680
00:37:09,320 --> 00:37:12,440
schools who've said we had a
teacher who asked the child a

681
00:37:12,440 --> 00:37:17,200
question and that child then
disclosed to them what had been

682
00:37:17,200 --> 00:37:21,360
going on online, but they weren't
disclosing until they were asked that

683
00:37:21,360 --> 00:37:23,800
question.
The teacher had identified something wasn't

684
00:37:23,800 --> 00:37:27,720
quite right.
So open lines of communication

685
00:37:28,400 --> 00:37:33,800
are our best weapon, no
matter who we are when it comes

686
00:37:33,800 --> 00:37:38,360
to being a parent, being a
carer, being someone who's

687
00:37:38,360 --> 00:37:41,040
interested in, you know, the
health and well-being of children

688
00:37:41,640 --> 00:37:47,280
or school communities, being
teachers, being support staff,

689
00:37:47,360 --> 00:37:50,040
being management.
So it's really the

690
00:37:50,040 --> 00:37:52,840
communication that's key and that will
never change because this is

691
00:37:52,840 --> 00:37:54,760
about people.
That's why I say to parents, if

692
00:37:54,760 --> 00:37:57,440
it's a bully, that's
not about TikTok, that's about

693
00:37:57,600 --> 00:38:01,680
someone bullying your child.
So yeah, that's generally

694
00:38:01,680 --> 00:38:05,960
my advice, that schools
can help empower parents to get

695
00:38:05,960 --> 00:38:07,360
involved.
Once you're involved, they're

696
00:38:07,360 --> 00:38:09,840
glad.
They're happy they got involved.

697
00:38:11,800 --> 00:38:14,120
Now you mentioned the
Australian government just

698
00:38:14,120 --> 00:38:18,400
before, so we're speaking
now it's January 2026 and

699
00:38:18,400 --> 00:38:21,040
recently the Australian
government's implemented a

700
00:38:21,040 --> 00:38:23,800
social media ban for young
people.

701
00:38:23,800 --> 00:38:27,760
I'm wondering, kind of, your
view on that in terms of

702
00:38:27,760 --> 00:38:32,760
its potential effectiveness in
terms of addressing the

703
00:38:32,760 --> 00:38:35,680
objectives for which it was
introduced and whether that's

704
00:38:35,680 --> 00:38:39,000
really the reality on the ground
from what you're seeing so far?

705
00:38:39,520 --> 00:38:42,640
Yeah, Kevin, I think you know
what my opinion is.

706
00:38:44,840 --> 00:38:48,520
I think this is a no brainer and
it's long overdue.

707
00:38:49,400 --> 00:38:51,360
I'm going to say a couple of
things before I start.

708
00:38:51,360 --> 00:38:54,920
In the last 20 years in
Australia, and I believe it's

709
00:38:54,920 --> 00:38:57,760
probably replicated around the
world, we've had children under

710
00:38:57,760 --> 00:39:03,760
the age of 16 harm themselves,
take their own lives, suffer

711
00:39:03,760 --> 00:39:06,520
severe mental health issues
through bullying.

712
00:39:07,040 --> 00:39:10,160
We've had children sexually
abused, we've had children

713
00:39:10,160 --> 00:39:16,240
abducted and we've had children
murdered as a result of using

714
00:39:16,240 --> 00:39:19,600
these types of programs.
The government's moving to

715
00:39:19,600 --> 00:39:21,520
restrict kids from them until
they're 16.

716
00:39:21,520 --> 00:39:23,680
And do you know what we've done
as a community here?

717
00:39:24,000 --> 00:39:27,960
Absolutely nothing.
We've watched it happen for 20

718
00:39:27,960 --> 00:39:32,040
years and done nothing.
Finally, the government has

719
00:39:32,040 --> 00:39:34,240
stepped in and said enough's
enough.

720
00:39:34,920 --> 00:39:37,800
What is best for our children?
What are they going to really

721
00:39:37,800 --> 00:39:41,560
miss out on in their life that's
positive and productive if they

722
00:39:41,560 --> 00:39:44,920
don't have access to these types
of programs?

723
00:39:45,680 --> 00:39:50,440
So there was a lot of media in
relation to this building up to

724
00:39:50,440 --> 00:39:53,040
this law coming into effect in
December last year.

725
00:39:54,280 --> 00:39:56,520
Do you know since then, I
haven't been back into a school

726
00:39:56,520 --> 00:39:57,800
yet.
That'll happen at the end of

727
00:39:57,800 --> 00:40:01,040
January, early February.
I've heard nothing.

728
00:40:01,920 --> 00:40:04,880
The idea of this is to create a
new norm.

729
00:40:05,400 --> 00:40:07,480
It's not even for this
generation.

730
00:40:08,040 --> 00:40:10,360
It's when the next generation
comes through.

731
00:40:11,120 --> 00:40:13,520
Parents and kids just know well
you don't have those accounts

732
00:40:13,520 --> 00:40:16,560
till you're 16.
And there's millions of other

733
00:40:16,560 --> 00:40:19,280
accounts you can use.
You can play games, you can chat

734
00:40:19,280 --> 00:40:22,200
to your friends through
communication programs, but you

735
00:40:22,200 --> 00:40:25,480
can't have these programs that
have been identified as being

736
00:40:25,480 --> 00:40:28,120
dangerous.
Lots of research was done.

737
00:40:28,120 --> 00:40:32,960
One of those pieces of research
the report relied on was in the UK

738
00:40:32,960 --> 00:40:37,960
where 17,000 young people were
surveyed.

739
00:40:38,000 --> 00:40:42,280
And it was a study that was done
and it found that particularly

740
00:40:42,280 --> 00:40:50,280
boys aged 13 to 15 and girls
11 to 13 had

741
00:40:50,280 --> 00:40:54,440
decreased life satisfaction when
they're exposed to these types

742
00:40:54,440 --> 00:40:56,360
of programs.
I can tell you social media

743
00:40:56,360 --> 00:41:00,200
connects your kids with 5
billion people, every single

744
00:41:00,200 --> 00:41:01,960
adult person,
every issue this world's got to

745
00:41:01,960 --> 00:41:04,560
offer.
I mean, and we wonder why

746
00:41:04,560 --> 00:41:06,840
there were issues.
So look, it's not going to be

747
00:41:06,840 --> 00:41:08,840
easy.
It's not saying we don't trust

748
00:41:08,840 --> 00:41:10,680
kids.
It's not saying we don't trust

749
00:41:10,680 --> 00:41:13,640
the nature of our world.
The companies themselves will

750
00:41:13,720 --> 00:41:17,440
not do it.
Now the big thing that parents

751
00:41:17,440 --> 00:41:19,760
and communities are wondering is,
how are they going to do it?

752
00:41:20,280 --> 00:41:25,480
The onus is on the social media
companies to know who their

753
00:41:25,480 --> 00:41:28,080
customers are.
They can do this through

754
00:41:28,080 --> 00:41:31,560
algorithms.
So, you know, if someone's got

755
00:41:31,560 --> 00:41:37,520
an account and they've got 200
followers and 190 of those

756
00:41:37,520 --> 00:41:41,360
followers are under 16, I would
say that user is under 16. Bang.

757
00:41:41,360 --> 00:41:44,280
The account's deleted.
They do have the option to

758
00:41:44,280 --> 00:41:47,160
deactivate it so it can be
reactivated at 16, but that's

759
00:41:47,160 --> 00:41:48,800
going to be up to the company
itself.

760
00:41:49,440 --> 00:41:51,680
So there's algorithms: who
they're adding, you know, what

761
00:41:51,680 --> 00:41:53,280
they're searching up, how they're
behaving.

762
00:41:53,560 --> 00:41:58,000
These companies know who's who.
So they already know who we are.

763
00:41:58,000 --> 00:41:59,600
They know I'm sitting here
talking to you.

764
00:42:00,080 --> 00:42:03,760
That's their job, to
profile us, not only who we are,

765
00:42:03,760 --> 00:42:07,040
where we are, but what we want
to eat, when we're going to eat

766
00:42:07,040 --> 00:42:09,960
it.
So they already know, they've

767
00:42:09,960 --> 00:42:12,320
just never bothered to find out
because that's obviously going

768
00:42:12,320 --> 00:42:14,360
to take time.
It's going to erode some of the

769
00:42:14,360 --> 00:42:20,400
trust that their customers have.
So I think it's going to be the

770
00:42:20,400 --> 00:42:25,120
biggest shift, positive shift
when it comes to young people

771
00:42:25,120 --> 00:42:28,080
and technology that we've seen
since the introduction of

772
00:42:28,080 --> 00:42:31,080
technology.
So, again, it's not going to be

773
00:42:31,080 --> 00:42:33,680
perfect.
Can someone get around it?

774
00:42:33,680 --> 00:42:35,560
VPNs, whatever it is.
Yeah, maybe.

775
00:42:35,560 --> 00:42:36,960
But guess what?
None of their friends are going

776
00:42:36,960 --> 00:42:39,880
to be there because I've worked
this out over the years and you're no better

777
00:42:39,880 --> 00:42:42,000
than me.
Human beings, particularly young

778
00:42:42,000 --> 00:42:45,200
people, we like easy.
Things have got to be easy.

779
00:42:45,600 --> 00:42:47,480
We don't want it to be hard or
complicated.

780
00:42:47,840 --> 00:42:50,280
People just give up and they use
something else.

781
00:42:50,520 --> 00:42:52,960
So it's about changing the
culture.

782
00:42:53,440 --> 00:42:58,000
And here's one for you.
You'll get this parent who says

783
00:42:58,000 --> 00:43:02,440
it should be my decision whether
my child uses social media or

784
00:43:02,440 --> 00:43:05,680
not.
Ah, so really? Is it your choice

785
00:43:05,680 --> 00:43:07,440
whether your child wears a seat
belt or not?

786
00:43:07,880 --> 00:43:10,600
Is it your choice when your
child can legally consume

787
00:43:10,600 --> 00:43:12,880
alcohol?
Is it your choice when your

788
00:43:12,880 --> 00:43:14,320
child can vote?
No.

789
00:43:14,360 --> 00:43:17,640
Our community has said, because
this is a young person and

790
00:43:17,640 --> 00:43:20,320
they're growing and they're
developing, as a community we're

791
00:43:20,320 --> 00:43:22,680
going to put boundaries in place
to protect young people.

792
00:43:22,680 --> 00:43:26,720
This is another one of them.
So we're on to this already.

793
00:43:27,000 --> 00:43:28,640
It's something that we're used
to.

794
00:43:31,080 --> 00:43:33,960
I think it's a good thing at
this moment.

795
00:43:33,960 --> 00:43:36,800
I haven't checked the website,
but I can tell you that the

796
00:43:36,800 --> 00:43:42,320
sites that have been included,
and this list will evolve.

797
00:43:42,600 --> 00:43:45,240
Some may drop off if they change
their functionality.

798
00:43:45,240 --> 00:43:49,040
It's all about how the program
works, the programs they've

799
00:43:49,400 --> 00:43:52,480
looked at, and which are
included.

800
00:43:52,480 --> 00:43:55,400
It's called an age restricted
social media site.

801
00:43:56,880 --> 00:44:01,200
It's really a site where its
primary function, primary

802
00:44:01,200 --> 00:44:05,400
purpose is to connect people
randomly together where they can

803
00:44:05,400 --> 00:44:07,960
interact at a personal level and
share information.

804
00:44:08,640 --> 00:44:13,440
So the sites that have come
under this scope are Facebook,

805
00:44:13,440 --> 00:44:18,400
Instagram, Snapchat, TikTok and
YouTube.

806
00:44:18,800 --> 00:44:20,240
There have been a couple of
others added.

807
00:44:20,280 --> 00:44:23,840
I think there's Reddit, there's
another, there's another couple,

808
00:44:24,640 --> 00:44:28,200
I'm not sure about the country
that you're hosting this podcast

809
00:44:28,200 --> 00:44:30,320
from, but in Australia, they're
not popular with young people.

810
00:44:30,600 --> 00:44:34,000
But the fifth one here that I'm
sure is popular around the world

811
00:44:34,000 --> 00:44:41,840
with young people is YouTube.
Now don't think that means that

812
00:44:41,840 --> 00:44:43,880
young people can't watch YouTube
videos.

813
00:44:43,880 --> 00:44:47,000
It means they can't create a
channel or an account in

814
00:44:47,000 --> 00:44:50,520
YouTube.
So that's my belief because it

815
00:44:50,520 --> 00:44:52,760
is still evolving.
It's still being implemented.

816
00:44:53,120 --> 00:44:57,000
I'm very, very interested to get
back to schools too, because the

817
00:44:57,000 --> 00:45:00,000
kids are on holidays here, so,
well, vacation here.

818
00:45:00,520 --> 00:45:04,360
So they haven't really come
together to discuss this and to

819
00:45:04,360 --> 00:45:07,280
see what's happening and who's
got what and what experiences they've

820
00:45:07,280 --> 00:45:08,360
had.
And the schools haven't heard

821
00:45:08,360 --> 00:45:10,560
that.
So I'm really interested to get

822
00:45:10,560 --> 00:45:12,400
back into school this year.
It's going to be very, very

823
00:45:12,400 --> 00:45:14,520
interesting.
But the answer to your question,

824
00:45:15,080 --> 00:45:18,480
it's a godsend.
So all we can do is wait and

825
00:45:18,480 --> 00:45:19,840
see
what's going to happen.

826
00:45:20,720 --> 00:45:22,880
Yeah, I, I couldn't agree more,
couldn't agree more.

827
00:45:22,880 --> 00:45:27,080
So just to wrap things up
and draw things to a close.

828
00:45:27,080 --> 00:45:30,240
Now, you shared a lot of
information, a lot of, you know,

829
00:45:30,360 --> 00:45:34,160
experience, advice, suggestions.
If there was just one thing, one

830
00:45:34,160 --> 00:45:36,480
main thing that someone
listening to this could take

831
00:45:36,480 --> 00:45:39,880
away, whether that's, you
know, someone primarily who's

832
00:45:39,880 --> 00:45:42,600
working in education, what's
the most important

833
00:45:42,600 --> 00:45:45,040
thing you'd like them to take
away from this conversation?

834
00:45:45,040 --> 00:45:49,960
You know what it is? Technology
and the Internet have opened up

835
00:45:49,960 --> 00:45:53,600
so many great opportunities for
us and most people.

836
00:45:53,600 --> 00:45:56,200
Always remember most people have
a pretty good time online,

837
00:45:56,200 --> 00:46:00,520
including children.
It would be to acknowledge the

838
00:46:00,520 --> 00:46:03,280
dangers are real.
As a carer, I have a right to

839
00:46:03,280 --> 00:46:06,520
get involved, and it's basically
using the skills I already have,

840
00:46:07,160 --> 00:46:11,320
my instincts, my life skills,
and by continuing to talk to young

841
00:46:11,320 --> 00:46:14,600
people about technology,
we're just going to see issues

842
00:46:14,600 --> 00:46:16,800
drop away.
The bad things happen in

843
00:46:16,800 --> 00:46:19,160
private.
So it's all about that

844
00:46:19,160 --> 00:46:21,120
conversation and that
communication.

845
00:46:21,120 --> 00:46:24,080
I'd like to make it a lot more
clever and interesting and

846
00:46:24,080 --> 00:46:26,760
complex, but I really don't
believe it is.

847
00:46:26,800 --> 00:46:28,560
Fantastic.
Well, that's, I think that's a

848
00:46:28,560 --> 00:46:30,960
great place to leave it, Brett.
So look, thanks so much, really

849
00:46:30,960 --> 00:46:33,080
appreciate your time today
sharing your

850
00:46:33,240 --> 00:46:37,640
knowledge and experience and
background, just so valuable

851
00:46:37,640 --> 00:46:39,280
and interesting.
Thanks so much.

852
00:46:40,080 --> 00:46:42,000
Thank you, Kevin.
It's been an absolute pleasure.

853
00:46:42,000 --> 00:46:45,160
Good luck for 2026 and the same
to all your listeners.

854
00:46:45,440 --> 00:46:47,120
Thanks so much for listening to
the episode.

855
00:46:47,320 --> 00:46:49,400
If you enjoyed this
conversation, don't forget to

856
00:46:49,400 --> 00:46:51,800
subscribe, like, follow, et
cetera.

857
00:46:52,120 --> 00:46:55,080
Drop a comment below to let me
know anything you'd like covered

858
00:46:55,080 --> 00:46:59,960
in upcoming episodes or
suggestions for future guests.

859
00:47:00,480 --> 00:47:03,880
You can also connect with me on
Instagram and LinkedIn.