March 20, 2025

Massive Disruption Coming to the Mining & Investment World (Rusty and Brice)

Today’s episode is a unique conversation with special guests Russell “Rusty” Delroy of Nero Resource Fund & Brice Gower of Augment Technologies.


We venture into completely new terrain, exploring how the technological megatrends afoot today (think AI and robotics) could totally upend the mining industry as we know it, and the investment themes that flow from that big-picture view. This episode goes deep and covers some lateral concepts – but trust us, it’s worth it.


AI is the defining technology of our era, and its implications are not limited to the digital world.


Our discussion traverses how nimble governments need to be in this age, how disruption will vary across mineral assets, what consumerism looks like in a cheap metal world and who’ll be first to take advantage of this paradigm shift.


Intro Music: Here Come the Robots by Stoned Jesus

 

Sign-up for the Director’s Special

 

Please read our Privacy Policy and Disclaimer here

 

Discounted tickets available for the AusIMM Underground Operators Conference exclusively via Money of Mine. Get $100 off using the code MOM100.

Adelaide, 7 - 9 April 2025 (Matty will be there in full conference mode)

 

Thank you to our Partners:

 

Mineral Mining Services – Your preferred mining contractor

 

https://www.mineralminingservices.com.au/join-us/ - 1300 546 117

 

Grounded - Infrastructure for remote mining and civil projects Australia wide

 

Paul Natoli - pn@groundedgroup.com.au

 

Sandvik Ground Support – The only ground support you’ll ever need

 

http://www.rocktechnology.sandvik/groundsupport

 

CRE Insurance – Insurance Brokers for the Construction, Resources and Energy sectors

 

davidh@creinsurance.com.au - +61 2 9493 6100

 

K-Drill – Safe, reliable, and productive surface RC drilling

 

drew@k-drill.com.au - +61 416 015 876

 

WA Water Bores – WA’s premier water well drilling company

 

James@wawaterbores.com.au - +61 429 695 538

 

Swick - Seventeen million metres drilled. Twenty-plus global sites. Ninety rigs. 

 

rob.burnett@swickmining.com - +61 8 6253 2310

 

Quattro Project Engineering - Global Engineering partnerships delivered with efficiency, innovation and energy

 

info@quattrope.com - +61 8 9373 1140

 

Cross Boundary Energy – Independent Power Producer for the global mining industry

 

tim.taylor@crossboundary.com - +61 466 184 943 


(0:00:00) Introduction


(0:03:16) AI - The big thematic


(0:11:55) What industries are vulnerable


(0:22:45) What does a future mine look like?


(0:41:40) What changes on the mining investment front?


(0:47:40) When did the penny drop for Rusty?


(0:55:55) Model innovations


(0:58:50) The data


(1:06:19) Final thoughts

1
00:00:00,080 --> 00:00:09,270
Let's get spiritual, boys.
I, I, I genuinely believe like

2
00:00:09,270 --> 00:00:12,510
for AI, to me, what is it?
It's the most important thing

3
00:00:12,510 --> 00:00:18,270
in human history. This is,
you cannot overstate how fucking

4
00:00:18,270 --> 00:00:21,990
mega what is going on is to
humans.

5
00:00:22,110 --> 00:00:24,710
Our last invention.
Yeah, well, we're about to be

6
00:00:24,710 --> 00:00:27,230
the second most intelligent
species on the planet.

7
00:00:27,230 --> 00:00:33,880
We haven't seen anything like
this, not since World War 2.

8
00:00:34,440 --> 00:00:37,680
The the Cold War was nothing
compared to what this

9
00:00:37,720 --> 00:00:40,280
represents.
They have a very serious

10
00:00:40,280 --> 00:00:57,160
competitor and they're losing.
Righto, Money Miners, what a, what a

11
00:00:57,200 --> 00:01:00,520
what an intro. Rusty, Brice, I'm
very excited to have both of you

12
00:01:00,560 --> 00:01:04,040
in the studio as we've got the
dulcet tunes of Stoned Jesus,

13
00:01:04,040 --> 00:01:07,240
Here Come the Robots.
A timely suggestion, Rusty.

14
00:01:07,760 --> 00:01:11,080
I love it from you and the the
both of you are with us because

15
00:01:11,080 --> 00:01:14,120
you happen to have different
lenses on a topic.

16
00:01:14,120 --> 00:01:16,440
We're very keen to explore and I
reckon our audience is going to be

17
00:01:16,440 --> 00:01:19,880
really keen to, to explore.
And, and that's, that's here

18
00:01:19,880 --> 00:01:21,680
come the robots.
I mean, we're, we're, we're

19
00:01:21,680 --> 00:01:25,160
looking at the forefront of some
really big kind of technological

20
00:01:25,160 --> 00:01:28,040
changes happening.
And these changes, they happen

21
00:01:28,040 --> 00:01:30,240
in an exponential way.
And before you know it, you

22
00:01:30,240 --> 00:01:33,200
know, the universe looks pretty
different as a result of that.

23
00:01:33,560 --> 00:01:35,400
The, our, our world looks pretty
different.

24
00:01:35,520 --> 00:01:37,640
Mining, mining looks different.
Mining investment landscape

25
00:01:37,640 --> 00:01:40,720
looks different.
So there's three themes that

26
00:01:40,720 --> 00:01:44,200
we're, we're going to explore.
These are three giant growth

27
00:01:44,200 --> 00:01:46,680
megatrends in that tech space.
And we're going to just try and

28
00:01:46,680 --> 00:01:51,880
think like 15 years out from
now, think AI, robotics and

29
00:01:51,880 --> 00:01:54,040
compute efficiency.
Those are the three kind of

30
00:01:54,040 --> 00:01:57,640
growth mega trends going to have
a drastic impact on, on all

31
00:01:57,640 --> 00:02:00,240
parts of our life.
And I think we're going to try

32
00:02:00,240 --> 00:02:03,520
and explore the implications of
that to, to, to the, the world

33
00:02:03,520 --> 00:02:05,840
that, you know, we, we operate
in being the, the mining space

34
00:02:05,840 --> 00:02:10,479
and the, the investment side.
Both Rusty, you, and Brice next to

35
00:02:10,479 --> 00:02:13,600
us have been thinking about this
from your respective lenses.

36
00:02:13,760 --> 00:02:14,880
Thank you so much for joining
us.

37
00:02:15,080 --> 00:02:16,600
Thanks for having us.
Yeah, cheers mate.

38
00:02:16,920 --> 00:02:19,280
I should say should clarify for
the audience.

39
00:02:19,640 --> 00:02:23,040
For those who don't know either
of you: Rusty, you are a legend in

40
00:02:23,040 --> 00:02:26,320
the resource fund
management game, with

41
00:02:26,320 --> 00:02:29,200
performance that makes everyone
kind of just be envious.

42
00:02:29,200 --> 00:02:30,560
Well, that's 50 bucks well
spent.

43
00:02:32,880 --> 00:02:36,680
And you know, one thing I pick
up having known you for, for the

44
00:02:36,680 --> 00:02:39,920
last couple of years is you're,
you're always thinking big

45
00:02:39,920 --> 00:02:41,680
thematics ahead.
You're trying to kind of be

46
00:02:41,680 --> 00:02:44,120
early to the, to the trends and
then you kind of work, work

47
00:02:44,120 --> 00:02:45,800
backwards with your portfolio
allocation.

48
00:02:45,800 --> 00:02:49,600
And I know you've been thinking
about, about these things a lot.

49
00:02:49,920 --> 00:02:52,720
And so we've, we're hoping to
couple you with, with Bryce, who

50
00:02:52,720 --> 00:02:54,760
comes at it from a different
perspective, not the funds

51
00:02:54,760 --> 00:02:57,800
management space, but instead,
you know, he, he's armed and an

52
00:02:57,800 --> 00:03:01,120
entrepreneur in the mining kind
of AI and technology space.

53
00:03:01,120 --> 00:03:05,960
But strangely is also fit for
the conversation because you've

54
00:03:05,960 --> 00:03:07,880
got, you've got, you've had
experience in the robotics,

55
00:03:07,880 --> 00:03:12,360
humanoid robotic side too, with,
you know, a previous iteration

56
00:03:12,360 --> 00:03:15,840
of your startup, Brice.
So both of you, I think we're

57
00:03:15,840 --> 00:03:18,240
going to hope you bounce off
each other and explore some of

58
00:03:18,240 --> 00:03:20,920
these topics, the real world.
Yeah, you're in the real world.

59
00:03:20,960 --> 00:03:23,040
Yeah, well, look, when we
started AI mining, that was

60
00:03:23,040 --> 00:03:27,120
definitely a non-consensus call.
That was a bad idea pre 2016,

61
00:03:27,120 --> 00:03:29,480
but now it's all the rage and
it's a popular thing.

62
00:03:29,480 --> 00:03:31,880
So yeah, non-consensus calls is
the way to do it right?

63
00:03:32,360 --> 00:03:34,880
Get after it.
Did they even call it AI in

64
00:03:34,880 --> 00:03:38,400
mining back then?
Well, yes, it was, it was, yeah,

65
00:03:38,560 --> 00:03:41,080
data science was really,
especially because we like to

66
00:03:41,080 --> 00:03:43,520
work with geologists.
And so for them, a lot of this

67
00:03:43,520 --> 00:03:46,200
stuff was made in the 80s.
This wasn't new at all.

68
00:03:46,200 --> 00:03:49,600
And so the real innovation was
the compute, right?

69
00:03:49,600 --> 00:03:53,400
It actually wasn't AI science.
And we we kind of point to an

70
00:03:53,400 --> 00:03:57,720
innovation in 2016 where AlexNet
was the first machine learning

71
00:03:57,720 --> 00:04:00,600
neural network to be ported onto
a graphics compute card.

72
00:04:00,960 --> 00:04:05,520
So making it work on a GPU
instantly made it 250 times more

73
00:04:05,520 --> 00:04:09,400
compute power applied to the
problem, and from there it's

74
00:04:09,400 --> 00:04:12,120
scaled out of control.
And NVIDIA is what it is now

75
00:04:12,120 --> 00:04:14,800
because you know, that's the
most efficient way.

76
00:04:15,160 --> 00:04:17,920
So whilst we can talk about
software improvements in AI

77
00:04:17,920 --> 00:04:20,279
tech, right, like the the
hardware, the compute is a

78
00:04:20,279 --> 00:04:22,520
massive part of it too.
You don't get the software

79
00:04:22,520 --> 00:04:25,320
without hardware.
We we should just define some

80
00:04:25,320 --> 00:04:27,800
terms just to start with.
Kind of big picture.

81
00:04:28,600 --> 00:04:30,680
What?
What the fuck do we mean by AI?

82
00:04:32,000 --> 00:04:34,400
Well, I think it means it means
a lot of things for different

83
00:04:34,400 --> 00:04:36,080
contexts.
So you, I think you're right,

84
00:04:36,640 --> 00:04:40,920
you could really simply just
call AI a non-linear multivariate

85
00:04:41,040 --> 00:04:43,880
model, right?
It's it's not linear, it's not

86
00:04:43,880 --> 00:04:46,240
rigid.
And neural networks are really

87
00:04:46,240 --> 00:04:50,680
common, but LLMs use a
transformer architecture, which

88
00:04:50,680 --> 00:04:53,320
is kind of different.
And in the past, there's a lot

89
00:04:53,320 --> 00:04:55,560
of architectures that have
become not popular now, right?

90
00:04:55,560 --> 00:04:59,400
Like graph networks and LSTMs
used to be all these different

91
00:04:59,400 --> 00:05:02,560
options.
So really as far as we see it

92
00:05:02,560 --> 00:05:06,160
from a commercial landscape and
when we're looking for change, I

93
00:05:06,160 --> 00:05:10,680
think the way to look at it is
that there is now a new level of

94
00:05:10,880 --> 00:05:15,520
software algorithms that can
apply vast compute power to our

95
00:05:15,520 --> 00:05:18,120
problems.
And so part of our thinking

96
00:05:18,120 --> 00:05:22,480
needs to be about what are these
long standing problems that

97
00:05:22,480 --> 00:05:24,920
maybe we've even forgotten the
problems and we're just so used

98
00:05:24,920 --> 00:05:28,520
to mitigating and working around
those things that now we can

99
00:05:28,520 --> 00:05:33,200
revisit with this step change in
application of compute power.

100
00:05:33,400 --> 00:05:36,640
I found myself doing that in my
own head with with AI, just

101
00:05:36,640 --> 00:05:38,480
things I've parked.
I'm like, oh, that's too hard.

102
00:05:38,960 --> 00:05:40,040
Oh no, I need to come back to
that.

103
00:05:40,080 --> 00:05:42,160
Because you can think about it
differently.

104
00:05:42,360 --> 00:05:45,720
I mean, I think, like, full
disclosure, I'm a full layman,

105
00:05:45,920 --> 00:05:47,360
right?
This is not my area of

106
00:05:47,360 --> 00:05:50,440
expertise.
It fascinates me enormously.

107
00:05:50,440 --> 00:05:53,960
And I hope this is like the
first of many chats that you

108
00:05:53,960 --> 00:05:57,800
guys do on it because I think
this is the most seminal thing

109
00:05:57,800 --> 00:06:01,360
we've all seen in our lives.
I genuinely believe, like for

110
00:06:01,440 --> 00:06:04,520
AI, to me, what is it?
It's the most important thing in

111
00:06:04,520 --> 00:06:09,440
human history.
This is you cannot overstate how

112
00:06:09,560 --> 00:06:14,040
fucking mega what is going on
is to humans.

113
00:06:14,040 --> 00:06:15,520
Our last invention.
It, it, it.

114
00:06:15,800 --> 00:06:17,320
Yeah.
Well, we're about to be the

115
00:06:17,320 --> 00:06:20,560
second most intelligent species
on the planet, and that has

116
00:06:20,560 --> 00:06:23,560
never that.
For all of modern human history,

117
00:06:23,560 --> 00:06:24,560
that has not been the case.
So.

118
00:06:24,720 --> 00:06:26,760
That we know of.
Yeah, well, that we know of.

119
00:06:26,760 --> 00:06:29,840
Yeah, that probably another.
Another track, another show.

120
00:06:29,880 --> 00:06:31,240
Aliens.
At the end, yeah, yeah, yeah,

121
00:06:31,240 --> 00:06:34,520
yeah, yeah.
When so so AI to me is like just

122
00:06:35,520 --> 00:06:38,600
absolutely the most important
topic you can consider right now

123
00:06:38,600 --> 00:06:41,720
as a business person in your
personal life that you can't

124
00:06:41,720 --> 00:06:45,480
overstate how mega it is to
get with the program.

125
00:06:45,480 --> 00:06:48,200
It there will be two types of
people.

126
00:06:48,200 --> 00:06:51,240
I think in in, you know, as
short a time frames, like maybe

127
00:06:51,240 --> 00:06:53,520
three to five years.
There'll be those that worked

128
00:06:53,520 --> 00:06:59,160
out how to utilize AI and you
know, empowered themselves in a

129
00:06:59,160 --> 00:07:01,640
vertical in that sense.
So from underneath themselves,

130
00:07:01,640 --> 00:07:06,280
they then multiplied out their
capacity to do things.

131
00:07:07,160 --> 00:07:09,880
And then there'll be those that
didn't and those that didn't

132
00:07:09,880 --> 00:07:13,920
will, by and large, be
displaced and they'll have an

133
00:07:13,920 --> 00:07:17,360
opportunity to perhaps catch up,
but you'll be a long way behind

134
00:07:17,360 --> 00:07:21,520
individuals that did.
And I'd and I'd and I think to

135
00:07:21,560 --> 00:07:24,360
think about this, it's it's a
non linear sort of thing.

136
00:07:24,360 --> 00:07:27,040
Like I, I think the example I
gave before and it's probably a

137
00:07:27,040 --> 00:07:28,440
poor one, but I'll, I'll try it
again.

138
00:07:29,880 --> 00:07:32,280
You know, if your mate starts
learning French a year before

139
00:07:32,280 --> 00:07:35,320
you, you can start a year later.
You know, there's things you can

140
00:07:35,320 --> 00:07:37,360
do to catch up.
It's you know, you're, you're

141
00:07:37,360 --> 00:07:40,720
set distance behind him.
It's another human being.

142
00:07:40,720 --> 00:07:43,040
It's another human being.
You know, you generally learn at

143
00:07:43,040 --> 00:07:46,440
a similar pace, etcetera.
It's pretty fixed.

144
00:07:48,080 --> 00:07:50,480
It's different in AI, the whole
thing.

145
00:07:50,480 --> 00:07:55,560
If you if you make this work,
the ramifications are

146
00:07:55,560 --> 00:08:00,120
exponential.
You can have a million people

147
00:08:00,400 --> 00:08:05,880
working underneath an individual
mind and so the sooner you get

148
00:08:05,880 --> 00:08:09,520
with that, the better
because the gap from you

149
00:08:09,520 --> 00:08:13,400
to another human being in terms
of your effective output is

150
00:08:13,520 --> 00:08:16,240
going to be so fucking big.
Put in a Grounded camp to get

151
00:08:16,240 --> 00:08:19,560
maximum human output on your on
your mine site.

152
00:08:19,560 --> 00:08:22,600
Sorry to interrupt mate.
Check out check out this bloody

153
00:08:22,600 --> 00:08:25,920
accommodation expansion and and
Wellness centre they've put

154
00:08:25,920 --> 00:08:29,360
together for FMG mate, I'll
tell you what, it's even making

155
00:08:29,360 --> 00:08:32,120
me want to not drink and
get healthier.

156
00:08:32,360 --> 00:08:35,760
Well look, they've bloody got
yoga mats, nice stone.

157
00:08:36,000 --> 00:08:41,039
Just good quotes on the wall,
friggin, look at the, look at

158
00:08:41,039 --> 00:08:43,400
the grass.
Look at the gym mate this.

159
00:08:43,520 --> 00:08:45,760
Work out
there mate, I want to get

160
00:08:45,760 --> 00:08:48,880
healthy on a mine site buddy.
How good is that?

161
00:08:48,880 --> 00:08:53,440
Facilitated by Grounded, mate.
They don't even need AI.

162
00:08:53,440 --> 00:08:56,200
They are the AI of
accommodation.

163
00:08:57,240 --> 00:08:58,440
Accommodation.
Abuse.

164
00:08:58,600 --> 00:09:01,160
Intelligence AI.
Let's get back to it.

165
00:09:02,520 --> 00:09:06,000
So that that's a really well
explored philosophy called an

166
00:09:06,000 --> 00:09:09,560
intelligence explosion.
And whilst you can definitely

167
00:09:09,560 --> 00:09:13,760
look at what would happen if AI
was able to iteratively improve

168
00:09:13,760 --> 00:09:16,440
itself, which is one of the
things OpenAI is doing at the

169
00:09:16,440 --> 00:09:21,000
moment with ChatGPT.
But we also don't actually know

170
00:09:21,000 --> 00:09:24,040
what the spectrum of
intelligence looks like, right?

171
00:09:24,040 --> 00:09:26,680
You can look at our human
intelligence compared to a

172
00:09:26,680 --> 00:09:29,920
chicken and say is, is an AI
going to be that much more

173
00:09:29,920 --> 00:09:32,360
intelligent than us?
But it absolutely is

174
00:09:32,560 --> 00:09:36,400
exponential, and we can't really
even fathom what 1000 times more

175
00:09:36,400 --> 00:09:40,120
intelligent really means.
Yeah, yeah, yeah, that, yeah,

176
00:09:40,120 --> 00:09:43,040
there's some fascinating, but I
mean, arguably depends on how

177
00:09:43,040 --> 00:09:46,000
you measure intelligence.
Arguably there's in many ways

178
00:09:46,000 --> 00:09:49,960
it's passed most of us already.
I think someone asked a question

179
00:09:50,480 --> 00:09:53,480
like what is the most unique
thing about humans or I can't

180
00:09:53,680 --> 00:09:56,880
remember exactly how it's framed
and it's sort of spat out that

181
00:09:56,880 --> 00:10:00,480
we're all self domesticated.
We we self domesticate

182
00:10:00,480 --> 00:10:03,600
ourselves.
I was like, Oh my God, that's so

183
00:10:03,600 --> 00:10:05,440
insightful.
I can't imagine a human being

184
00:10:05,440 --> 00:10:09,000
ever actually coming up with
that level of reflection.

185
00:10:09,560 --> 00:10:13,000
So yeah, I don't know.
It depends on how you measure

186
00:10:13,000 --> 00:10:14,520
it.
Arguably it's it's already

187
00:10:14,520 --> 00:10:18,480
passed a lot of us.
And the reality is it's going to

188
00:10:18,480 --> 00:10:23,240
get to a level very quickly in
every field of endeavour where

189
00:10:23,240 --> 00:10:26,440
it's as good as the best humans.
On Earth, yeah.

190
00:10:26,440 --> 00:10:29,800
So artificial super intelligence
is a is a good way to define

191
00:10:29,800 --> 00:10:31,520
that, right.
And you know, you can talk to a

192
00:10:31,520 --> 00:10:34,360
lot of LLMs at the moment where
if you're asking narrowly in the

193
00:10:34,360 --> 00:10:38,840
field, you kind of are getting
master's-to-PhD-level responses.

194
00:10:39,120 --> 00:10:43,640
But the next step of AGI is
about that overlap of having a

195
00:10:43,640 --> 00:10:47,000
PhD in everything and what can
you put together that these

196
00:10:47,000 --> 00:10:51,360
individual experts never could.
So, yeah, like I, I think that

197
00:10:51,360 --> 00:10:53,320
that is going to be upon us
before we know it.

198
00:10:53,520 --> 00:10:57,160
So a horizon of 15 years is
probably a really great one to

199
00:10:57,160 --> 00:10:58,680
speculate over.
Yeah.

200
00:10:58,680 --> 00:11:01,640
The interesting development I
think with AI is like what is

201
00:11:01,640 --> 00:11:05,520
the human interface look like?
No doubt that will evolve.

202
00:11:05,520 --> 00:11:07,360
No doubt there will be
components of the human

203
00:11:07,360 --> 00:11:11,680
interface today that also
disappear as as AI gets to

204
00:11:11,680 --> 00:11:14,760
manage AI with agency and
derivatives thereof.

205
00:11:15,360 --> 00:11:20,800
But as far as I've sort of
understood it, the human lens,

206
00:11:21,280 --> 00:11:23,680
certainly here and now, it's
absolutely required.

207
00:11:23,680 --> 00:11:26,720
AI is not perfect.
It makes lots of mistakes.

208
00:11:26,720 --> 00:11:31,200
You need to keep guiding it.
And so we're going to go through

209
00:11:31,200 --> 00:11:35,600
this evolution of the human and
what value we bring to the table

210
00:11:35,840 --> 00:11:38,560
with how to manage AI like that.
I think that's going to be the

211
00:11:38,560 --> 00:11:42,800
measure of how much utility you
as a person have in the future is

212
00:11:42,800 --> 00:11:46,600
how well you can manage AI.
Yeah, and.

213
00:11:46,600 --> 00:11:50,800
And not just manage, but grow
personally with it and use it to

214
00:11:50,960 --> 00:11:53,640
augment your capability.
Augmentation is the right word.

215
00:11:53,720 --> 00:11:55,480
Yeah, yeah, yeah.
This is a.

216
00:11:55,800 --> 00:11:58,720
This is a, a point you make
Brice, and that's the, you know,

217
00:11:59,480 --> 00:12:03,680
the, the AI we all
think of when we hear the word AI

218
00:12:03,680 --> 00:12:06,800
is our, our interface with, with
LLMs at this point, because

219
00:12:06,800 --> 00:12:09,080
that's been like incredibly
useful tool for anyone doing any

220
00:12:09,080 --> 00:12:11,720
sort of research and trying to
disseminate their, their

221
00:12:11,720 --> 00:12:14,520
thoughts in a coherent way that
people can explore quickly.

222
00:12:16,240 --> 00:12:19,920
But it's self-improving, like
you say, Rusty. And because

223
00:12:19,920 --> 00:12:23,560
it's self-improving, what
industries are most vulnerable

224
00:12:23,560 --> 00:12:29,160
to disruption?
Look, I think there are some

225
00:12:29,160 --> 00:12:33,120
really core skills to being a
human and and having

226
00:12:33,120 --> 00:12:38,640
personality, and I like to map
those out as intuition,

227
00:12:39,240 --> 00:12:44,280
hospitality and artistic
expression, right?

228
00:12:44,280 --> 00:12:47,840
Those are things that are
uniquely human and it art isn't

229
00:12:47,840 --> 00:12:50,600
necessarily like songs or or
painting, right?

230
00:12:50,600 --> 00:12:53,680
I think your finance work
picking junior miners, that's an

231
00:12:53,680 --> 00:12:55,680
art, right?
And definitely.

232
00:12:55,720 --> 00:13:01,720
More art than song, we all know
that. So those elements, I think

233
00:13:01,720 --> 00:13:05,080
are intrinsically human roles,
but there's a lot of things that

234
00:13:05,120 --> 00:13:07,840
are actually really much better
for machines and, and those are

235
00:13:07,840 --> 00:13:10,880
things that are monotonous,
highly repetitive things that

236
00:13:10,880 --> 00:13:15,320
need to be very precise, right.
These shaky hands are, are only

237
00:13:15,320 --> 00:13:17,960
good for so much and dangerous,
right.

238
00:13:18,080 --> 00:13:20,560
If, if we can remove a human
from a dangerous role, then

239
00:13:20,560 --> 00:13:24,000
that's a win all round.
So you'll find that there's this

240
00:13:24,000 --> 00:13:27,280
really good overlap of those
skills and, and that really is,

241
00:13:27,440 --> 00:13:30,800
I think where AI is going to
augment a lot of people, right?

242
00:13:31,040 --> 00:13:33,880
And where a human can still have
their intuition.

243
00:13:33,880 --> 00:13:36,120
Where do I wanna dig?
Where does that drill hole go,

244
00:13:36,120 --> 00:13:37,960
right?
What political region do I feel

245
00:13:37,960 --> 00:13:40,960
like I can handle?
And you know, take those

246
00:13:40,960 --> 00:13:45,000
overwhelming driving
preferences, the completion

247
00:13:45,080 --> 00:13:47,840
right is going to be quite
monotonous to go through that

248
00:13:47,840 --> 00:13:50,480
process where you want to be
precise, avoid dangerous work.

249
00:13:50,720 --> 00:13:54,720
You can see robotics and AI
actually allowing a lot of

250
00:13:54,720 --> 00:13:57,000
people this freedom to move
forward.

251
00:13:57,280 --> 00:14:00,520
But what that means for capital
structures and how you find a

252
00:14:00,520 --> 00:14:02,840
lot of this stuff.
I think, I think the answer to that

253
00:14:03,120 --> 00:14:05,480
question is actually everything.
Like literally fucking

254
00:14:05,480 --> 00:14:07,880
everything.
Every single thing is pretty

255
00:14:07,880 --> 00:14:12,400
much going to be impacted by by
AI and and and then it's like

256
00:14:12,800 --> 00:14:15,000
really just like a hit list of
what comes first.

257
00:14:15,400 --> 00:14:19,200
And I think mining and why the
discussion today is so pertinent

258
00:14:19,200 --> 00:14:24,320
is mining makes sense as one of
the priority places to go,

259
00:14:24,640 --> 00:14:28,920
right, Because the human
interface in mining is high risk

260
00:14:29,240 --> 00:14:32,600
and high cost.
So and and it is quite

261
00:14:32,880 --> 00:14:36,080
monotonous.
Yeah, labour intensive or or you

262
00:14:36,080 --> 00:14:39,520
know, Yeah, yeah, I guess so.
Yeah.

263
00:14:39,520 --> 00:14:40,840
Load.
Load to plan, right?

264
00:14:41,120 --> 00:14:43,160
Dig to these lines every time.
That's the job.

265
00:14:43,400 --> 00:14:46,360
Yeah, there's, it's certainly,
it's very well suited to

266
00:14:46,360 --> 00:14:49,840
adoption there, particularly
around cost and risk.

267
00:14:49,840 --> 00:14:52,320
And I think if you're a board
and you're sitting there and

268
00:14:52,320 --> 00:14:55,520
you're, you know, we know how
risk averse that all is, there's

269
00:14:55,520 --> 00:14:58,160
personal liability etcetera.
If you've got a, a place where

270
00:14:58,160 --> 00:15:02,800
you can choose between a machine
versus putting a human at risk,

271
00:15:02,840 --> 00:15:05,720
you're immediately going to move
to the machine as soon as

272
00:15:05,720 --> 00:15:08,200
there's any sort of cost parity
or close enough.

273
00:15:08,840 --> 00:15:11,400
And on top of that, as soon as
there's an economic incentive,

274
00:15:11,400 --> 00:15:13,080
it's not just parity.
It's now like, oh, well,

275
00:15:13,080 --> 00:15:16,120
actually this will work much
more efficiently then.

276
00:15:16,120 --> 00:15:19,560
I think it's like a no brainer.
And, and, and the way you

277
00:15:19,560 --> 00:15:22,080
started the chat today is, is
the key thing because there's

278
00:15:22,080 --> 00:15:23,760
these three things happening at
once.

279
00:15:23,760 --> 00:15:26,400
We've got AI evolving at this
rapid rate.

280
00:15:26,800 --> 00:15:30,760
We've got processing power, for
want of a better word as I understand

281
00:15:30,760 --> 00:15:37,640
it, evolving at this rapid rate,
plus a whole different landscape

282
00:15:37,640 --> 00:15:42,760
in processing capacity alongside
it, which is quantum, which is a

283
00:15:42,760 --> 00:15:45,920
whole nother thing, not even a
progression.

284
00:15:45,920 --> 00:15:50,120
It's a whole universe different.
And those two things,

285
00:15:50,120 --> 00:15:54,760
processing capacity and AI,
those are already happening

286
00:15:54,760 --> 00:15:57,440
and they're being applied in the
digital realm.

287
00:15:58,040 --> 00:16:02,160
Those will then transfer and
morph into the physical realm

288
00:16:02,760 --> 00:16:04,480
and they're already happening.
So they're already happening

289
00:16:04,480 --> 00:16:09,280
with robotics, but we'll see
humanoid robotics applying those

290
00:16:09,280 --> 00:16:12,040
two and then all of this
transfers across to the

291
00:16:12,040 --> 00:16:14,920
physical.
And that's when I think it gets

292
00:16:14,920 --> 00:16:21,040
really real for mining, because
that's when you say, well, what,

293
00:16:21,080 --> 00:16:23,960
you know, if particularly like
as an underground ore body, for

294
00:16:23,960 --> 00:16:28,200
example, if I can send machines
in to do what humans were doing,

295
00:16:28,440 --> 00:16:33,040
I don't have to run vent or, you
know, say, say air and cooling.

296
00:16:33,040 --> 00:16:36,600
And I don't, I don't need all
the stabilisation that I

297
00:16:36,600 --> 00:16:39,200
currently require because, OK,
if I detonate 100 grand's worth of

298
00:16:39,200 --> 00:16:41,600
robots, who gives a shit?
You know, if I kill a human, it

299
00:16:41,600 --> 00:16:44,280
matters.
So suddenly those risk

300
00:16:44,280 --> 00:16:47,560
tolerances against the cost of
ground support and all of those

301
00:16:47,560 --> 00:16:54,000
things, CapEx on your, on your,
your, your, your front end camps

302
00:16:54,000 --> 00:16:58,200
and all the rest of it, fly in,
fly out, HR, IR, all of these

303
00:16:58,200 --> 00:17:00,760
things lighting you don't you
know these things operate in the

304
00:17:00,760 --> 00:17:05,000
dark. So, dark
mines, yeah.

305
00:17:05,000 --> 00:17:06,599
There will be dark mines.
There's already dark factories.

306
00:17:06,599 --> 00:17:09,440
So we're seeing those in China,
like pretty amazing, pretty

307
00:17:09,440 --> 00:17:12,400
dystopian.
But why, You know, lights are a

308
00:17:12,400 --> 00:17:15,599
human necessity.
They're not a these things don't

309
00:17:15,839 --> 00:17:18,839
they don't see, not with, not
with vision like us, I think.

310
00:17:18,839 --> 00:17:21,680
That the other part to dark
mines in the context of

311
00:17:21,680 --> 00:17:24,000
underground, right, is, you
know, something you guys have

312
00:17:24,000 --> 00:17:27,440
covered a lot recently is that
we're running out of resources

313
00:17:27,440 --> 00:17:30,040
at the surface, right?
We've kind of found all at least

314
00:17:30,040 --> 00:17:33,520
the tier one big long life
assets and now we're forced to

315
00:17:33,520 --> 00:17:35,800
explore deeper and deeper
underground.

316
00:17:36,120 --> 00:17:40,520
So I think 1 angle that this may
manifest in, in that 15 year

317
00:17:40,520 --> 00:17:44,200
horizon is that the economics
have to change.

318
00:17:44,200 --> 00:17:47,840
The reason we can't even explore
at that depth is because if you

319
00:17:47,840 --> 00:17:51,640
find something, the cost of
ventilation at that depth, transfer,

320
00:17:51,640 --> 00:17:55,200
haulage, it, it all starts to
just exponentially get more

321
00:17:55,200 --> 00:17:59,200
expensive the deeper you go.
So if you're able to remove

322
00:17:59,200 --> 00:18:02,280
ventilation immediately from the
mine designing CapEx structure,

323
00:18:02,320 --> 00:18:05,360
big cost remove, right?
Remove all of the, the safety,

324
00:18:05,360 --> 00:18:09,760
the shelters, the the O2 sensors
and the camps right now all of a

325
00:18:09,760 --> 00:18:14,160
sudden when you're doing that DFS
study, right, you can actually

326
00:18:14,360 --> 00:18:17,040
have a completely different
sustaining cost.

327
00:18:17,360 --> 00:18:20,320
And so then you throw on these
new technologies.

328
00:18:20,480 --> 00:18:24,040
So Muon sensing is a really big
one in underground exploration

329
00:18:24,040 --> 00:18:26,280
at the moment.
It just penetrates further and

330
00:18:26,280 --> 00:18:30,080
sees in a more cost-effective
manner, as a simple description.

331
00:18:30,600 --> 00:18:33,600
But I, I think that these are
the kind of things that are

332
00:18:33,600 --> 00:18:36,640
going to shape mines that will
be different.

333
00:18:36,960 --> 00:18:40,600
And in a 15-year horizon, I
don't expect that's what the

334
00:18:40,600 --> 00:18:43,800
mining industry will look like,
but I do expect there to be a

335
00:18:43,800 --> 00:18:48,400
couple of innovators out there that
have actually got at least one

336
00:18:48,400 --> 00:18:50,760
dark underground mine working
like that.

337
00:18:50,760 --> 00:18:52,280
I
reckon it's going to be much

338
00:18:52,280 --> 00:18:54,080
sooner.
Alright, it's going to, it's,

339
00:18:54,200 --> 00:18:56,560
it's, it's coming so fucking
rapid.

340
00:18:56,800 --> 00:19:00,520
Like, so I think this is going
to blow everyone's brains apart

341
00:19:00,520 --> 00:19:02,840
in the next three years.
Like so rapid.

342
00:19:02,840 --> 00:19:07,600
I mean, Elon's talking and I
think Elon is a guy you have to

343
00:19:07,600 --> 00:19:10,200
listen to, right?
He, he, I think he's the most

344
00:19:10,200 --> 00:19:12,200
important human being of,
you know, my

345
00:19:12,200 --> 00:19:15,240
lifetime. The things he's doing
with governance and the

346
00:19:15,240 --> 00:19:17,040
like are a whole different topic of
conversation.

347
00:19:17,040 --> 00:19:19,520
But absolutely, this is a
human

348
00:19:19,520 --> 00:19:23,520
you have to pay attention to.
His ability to think forward,

349
00:19:23,520 --> 00:19:25,080
there's no one else on earth
like him.

350
00:19:25,480 --> 00:19:30,240
And he is saying 90% of
everything, every good, every

351
00:19:30,240 --> 00:19:34,000
service is fundamentally
changing in terms of a cost

352
00:19:34,000 --> 00:19:37,200
structure, like almost to the
point of there's a front end,

353
00:19:37,200 --> 00:19:41,640
you know, a humanoid has a CapEx
component and ongoing

354
00:19:41,640 --> 00:19:43,640
maintenance and an energy cost
and that's it.

355
00:19:44,360 --> 00:19:49,440
So like, once you apply that to
everything you make, you know,

356
00:19:49,560 --> 00:19:53,800
any product, any service saying
90% of things are effectively

357
00:19:54,480 --> 00:19:57,520
nothing, they don't cost much.
Now that's a fucking mind

358
00:19:57,520 --> 00:20:00,440
blowing place to go.
And I think it's happening.

359
00:20:00,440 --> 00:20:03,200
It's going to happen far quicker
than we realized.

360
00:20:03,200 --> 00:20:05,720
These aren't that expensive to
build.

361
00:20:06,360 --> 00:20:08,000
And that's the bit to
watch.

362
00:20:08,040 --> 00:20:12,560
It's the robotics.
Because the AI and the

363
00:20:12,640 --> 00:20:14,960
processing power, I
don't think there's any argument

364
00:20:14,960 --> 00:20:17,640
on that, that is it.
It's right in front of you every

365
00:20:17,640 --> 00:20:18,280
week.
It's all.

366
00:20:18,280 --> 00:20:20,120
Theoretically possible today,
right?

367
00:20:20,120 --> 00:20:22,520
But I guess the mining
industry has kind of got a

368
00:20:22,520 --> 00:20:25,880
reputation for not adopting
technology quickly, right?

369
00:20:25,880 --> 00:20:28,840
Like, you know, you kind of
expect an oil and gas company

370
00:20:28,840 --> 00:20:31,040
will give that a go before a
mining company would.

371
00:20:31,360 --> 00:20:34,200
But I absolutely agree with you
on the theoretical level.

372
00:20:34,200 --> 00:20:36,480
I don't think there's many
blockers that mean it's not

373
00:20:36,480 --> 00:20:38,880
possible today.
So what's the catalyst for this

374
00:20:38,880 --> 00:20:39,480
to happen?
It'll be.

375
00:20:39,480 --> 00:20:43,040
The board, it's the board and
it's risk.

376
00:20:43,320 --> 00:20:45,680
So there's carrot and there's
stick.

377
00:20:45,720 --> 00:20:48,920
And yes, mining companies
respond much more to stick than

378
00:20:48,920 --> 00:20:51,400
they do carrot.
No one wants to take a risk.

379
00:20:51,400 --> 00:20:54,520
They're big, cumbersome, poorly
run, generally speaking,

380
00:20:54,520 --> 00:20:59,040
bureaucratic organisations.
And you know, why take a risk if

381
00:20:59,040 --> 00:21:01,080
it's only going to result in you
losing your job if it doesn't

382
00:21:01,080 --> 00:21:04,600
work and, you know, no praise if
it does type of thing.

383
00:21:04,600 --> 00:21:06,720
That's kind of the mentality in
big bureaucracies.

384
00:21:07,360 --> 00:21:10,480
And I think where
that'll change though is at the

385
00:21:10,480 --> 00:21:13,880
board level when they sit down
and they read a board paper that

386
00:21:13,880 --> 00:21:17,480
says I have to choose between
sending a human underground to

387
00:21:17,480 --> 00:21:21,920
do that or a humanoid.
How can you justify something

388
00:21:21,920 --> 00:21:26,800
that is cheaper, more effective,
and critically doesn't put

389
00:21:26,800 --> 00:21:29,600
anyone's life at risk?
That is the rub.

390
00:21:29,680 --> 00:21:32,920
And that is where I think we
will see this move quickly.

391
00:21:34,000 --> 00:21:35,840
Just a quick one, Rusty.
Which country does it happen in

392
00:21:35,840 --> 00:21:39,280
first?
The highest-cost countries,

393
00:21:39,280 --> 00:21:43,000
so Australia, Canada,
wherever your labor rates are

394
00:21:43,200 --> 00:21:47,640
highest and wherever your
regulations and how it relates

395
00:21:47,640 --> 00:21:51,840
to you as a director in your
personal liability is most

396
00:21:51,840 --> 00:21:53,280
extreme.
So yeah, Australia.

397
00:21:53,280 --> 00:21:55,600
In that, right, we're
positioned pretty well both for

398
00:21:55,600 --> 00:21:58,280
the innovation.
As a country, we're doing

399
00:21:58,280 --> 00:22:01,200
everything possible to be as
un-fucking-competitive as we can

400
00:22:01,320 --> 00:22:03,560
right now.
Like the federal government,

401
00:22:03,840 --> 00:22:06,800
the state, you name it, right,
They are doing everything under

402
00:22:06,800 --> 00:22:08,760
the sun to make us as
uncompetitive as possible.

403
00:22:08,760 --> 00:22:11,360
With energy policy, IR policy.
It is.

404
00:22:11,400 --> 00:22:14,880
It is mind blowing how bad we
are going.

405
00:22:15,480 --> 00:22:20,960
And don't even start about
native title and and the voice

406
00:22:20,960 --> 00:22:22,960
and these these types of things.
Thank God we dodged those

407
00:22:22,960 --> 00:22:26,280
bullets.
But but we are heading

408
00:22:26,280 --> 00:22:28,320
absolutely the wrong way on
every metric.

409
00:22:28,680 --> 00:22:33,000
This will be an antidote to some
of that in in first world mining

410
00:22:33,040 --> 00:22:37,480
and fuck do we need it.
I just want to, we're

411
00:22:37,600 --> 00:22:39,240
getting real like futuristic
here.

412
00:22:39,880 --> 00:22:41,880
What is this?
What is a mine 15 years from

413
00:22:41,880 --> 00:22:43,000
now?
Like, what does it look like?

414
00:22:43,360 --> 00:22:46,240
So a dark mine, you think, is
a possibility.

415
00:22:46,800 --> 00:22:48,440
Humans don't even need to be
there.

416
00:22:49,040 --> 00:22:50,440
Couldn't be anywhere near the
face.

417
00:22:50,440 --> 00:22:53,880
Well, for underground, absolutely,
right. So surface mining, open

418
00:22:53,880 --> 00:22:57,320
cut mines have a really good
level of autonomy going, right,

419
00:22:57,320 --> 00:22:59,840
like you can see all the tier
ones have basically got their

420
00:22:59,840 --> 00:23:02,600
haul trucks driving around
on their own.

421
00:23:02,960 --> 00:23:06,000
And even then, the light
vehicles have amazing collision

422
00:23:06,000 --> 00:23:09,680
avoidance tech, which is all the
kind of stuff you want to see in

423
00:23:09,680 --> 00:23:12,440
our current state.
But even when that gets to full

424
00:23:12,440 --> 00:23:16,760
autonomy, the design of open cut
mines doesn't really change,

425
00:23:16,760 --> 00:23:18,640
right?
Like it's still quite, it's,

426
00:23:18,920 --> 00:23:21,920
that's the way it is.
So yeah, the opportunity I

427
00:23:21,920 --> 00:23:25,560
think in surface mining is
really going to be about the

428
00:23:25,560 --> 00:23:29,360
current metal call factor.
So one of these problems that I

429
00:23:29,360 --> 00:23:32,280
think the industry kind of
forgets about is that when you

430
00:23:32,280 --> 00:23:35,000
look at what's in the reserve
and what's actually making it

431
00:23:35,000 --> 00:23:37,040
into the mill, right?
Not even talking about

432
00:23:37,040 --> 00:23:41,280
processing efficiency
downstream, there's a big gap

433
00:23:41,280 --> 00:23:44,920
that goes missing and people
have those contingency factors

434
00:23:44,920 --> 00:23:48,640
even built into, you know, PFS
studies that they pitch at PDAC

435
00:23:48,640 --> 00:23:51,200
because they know that's what
they've seen in their career.

436
00:23:51,480 --> 00:23:55,880
So as we start to reassess some
of these massive problems the

437
00:23:55,880 --> 00:23:58,200
industry has just been
mitigating, right?

438
00:23:58,520 --> 00:24:02,160
I can see the metal call
factor converging to almost

439
00:24:02,280 --> 00:24:05,800
exactly what's in the reserve
actually making it out the other

440
00:24:05,800 --> 00:24:09,120
side of the plant.
And so that that is really where

441
00:24:09,120 --> 00:24:15,600
surface mining's going to go.
And you know, the OEMs in terms

442
00:24:15,600 --> 00:24:18,760
of that automation have got to
make that cost accessible to the

443
00:24:18,760 --> 00:24:22,440
Tier 2 and Tier 3 smaller miners
that can't afford that big CapEx

444
00:24:22,440 --> 00:24:24,760
outlay.
So that is a big part of

445
00:24:24,760 --> 00:24:26,920
where the capital structure goes,
coming back to your key point,

446
00:24:26,920 --> 00:24:29,000
right, it's, it's all about
access to capital and cost

447
00:24:29,000 --> 00:24:30,520
effective capital, yeah.
For sure, yeah.

448
00:24:30,520 --> 00:24:33,760
It's going to be fascinating who
wins, who, like, it's

449
00:24:33,920 --> 00:24:36,680
it's all relativity, right, as
to who's a winner and who's a

450
00:24:36,680 --> 00:24:38,680
loser.
And yeah, I'd agree with you.

451
00:24:38,680 --> 00:24:41,800
And we're all guessing
at the moment, right?

452
00:24:41,800 --> 00:24:44,040
Like this is guesswork at best.
Speculation.

453
00:24:44,040 --> 00:24:47,480
It's speculation.
So yeah, the marginal

454
00:24:47,480 --> 00:24:51,000
return feels like underground.
You're going to get a lot, lot

455
00:24:51,000 --> 00:24:53,920
more dividend than, you
know, squeezing the lemon in

456
00:24:53,920 --> 00:24:57,160
something that's already big and
semi autonomous and low labor

457
00:24:57,160 --> 00:25:00,400
per unit moved from a tonnage
perspective, you know, with your

458
00:25:00,400 --> 00:25:03,760
big open cuts feels like less
juice to squeeze on that lemon.

459
00:25:04,200 --> 00:25:06,240
So yeah, for sure, it feels like
underground.

460
00:25:07,520 --> 00:25:09,920
It's going to be so fascinating
watching how this plays out,

461
00:25:09,920 --> 00:25:12,320
right?
Because like for physical metal

462
00:25:12,320 --> 00:25:15,480
prices, do they go up or down?
I think there's a debate about

463
00:25:15,480 --> 00:25:18,960
that, like depends on how fast
this all comes in.

464
00:25:20,600 --> 00:25:24,000
There's a lot of deflationary
factors at play here around the

465
00:25:24,000 --> 00:25:27,240
cost.
And I think if you just step

466
00:25:27,240 --> 00:25:31,000
right back and go big picture, if
you take what Elon's saying as

467
00:25:31,000 --> 00:25:35,120
correct, that 90% of
everything goods and services

468
00:25:35,120 --> 00:25:39,840
becomes far cheaper, then humans
have a great history of just

469
00:25:39,840 --> 00:25:43,560
consuming more, right?
I would almost guarantee that is

470
00:25:43,560 --> 00:25:47,400
going to mean we consume more.
That's going to mean we need

471
00:25:47,520 --> 00:25:51,800
more inputs. And so we're
going to need more raw materials,

472
00:25:52,080 --> 00:25:56,120
but the cost of producing them,
those should, theoretically,

473
00:25:56,480 --> 00:26:01,520
drop. But the
margin made should stay the same

474
00:26:01,520 --> 00:26:05,560
because at any given point in
time you need an incentive price

475
00:26:06,400 --> 00:26:10,520
above the cost, so the cost
and the realized

476
00:26:10,520 --> 00:26:12,560
price, the costs aren't the key
factors.

477
00:26:12,560 --> 00:26:14,680
It's the margin, it's the
incentive margin.

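[Editor's note: the incentive-margin argument above can be sketched as a quick calculation. All numbers here are hypothetical, and `incentive_price` is an illustrative helper, not anything the speakers defined.]

```python
# Sketch of the incentive-price idea: if the market must offer producers a
# margin over cost to incentivise new supply, then when costs deflate the
# realised price falls with them, but the rate of return itself persists.

def incentive_price(cost_per_tonne: float, margin: float = 0.15) -> float:
    """Price needed to give producers `margin` return over cost."""
    return cost_per_tonne * (1 + margin)

today = incentive_price(1000.0)      # price at a $1,000/t cost base
automated = incentive_price(400.0)   # price if automation cuts cost to $400/t

# Absolute dollars of margin per tonne shrink with the cost base,
# but the 15% incentive margin is unchanged in both cases.
print(round(today), round(automated))
```

The point the calculation illustrates: what an ore body is "worth in the ground" tracks the margin, not the absolute price level, so cost deflation alone need not destroy that value.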
478
00:26:15,120 --> 00:26:17,960
So I've been thinking about this
because it's like, oh, do you

479
00:26:17,960 --> 00:26:20,400
buy physical commodities?
I'm not sure about that.

480
00:26:20,400 --> 00:26:23,600
Like I don't, I don't know that
that plays out that well.

481
00:26:23,720 --> 00:26:25,960
The history
of technological innovation

482
00:26:25,960 --> 00:26:29,880
or productivity improvements in
mining just makes more ore

483
00:26:30,000 --> 00:26:33,360
economic.
We've had 100 years of deflation

484
00:26:33,360 --> 00:26:36,440
in commodities, that's
the reality.

485
00:26:36,560 --> 00:26:38,920
And I think that continues in
real terms, right?

486
00:26:38,920 --> 00:26:43,320
Inflation-adjusted, and I think
that continues. But

487
00:26:43,760 --> 00:26:46,400
what does it mean for what
something's worth in the ground?

488
00:26:46,760 --> 00:26:49,800
If you think about it, what
something's worth in the ground

489
00:26:49,800 --> 00:26:53,040
is a function of the margin that
it will attain when you pull it

490
00:26:53,040 --> 00:26:55,840
out.
That margin by and large has to

491
00:26:55,840 --> 00:26:58,600
stay somewhat constant.
You know, there's

492
00:26:58,600 --> 00:27:00,880
always commodity price swings,
but in order to get it out of

493
00:27:00,880 --> 00:27:03,720
the ground, there always has to
be that margin, whether it's a

494
00:27:03,720 --> 00:27:09,360
15% ROI, whatever the number is.
So the value of that ore body

495
00:27:09,360 --> 00:27:12,440
shouldn't move too much.
I'm still trying to. Unless

496
00:27:12,440 --> 00:27:14,200
there's
that change between open pit and

497
00:27:14,200 --> 00:27:17,960
underground, right, If
underground improves on a cost

498
00:27:17,960 --> 00:27:20,720
basis, like I said.
At a relative level, yeah, as

499
00:27:20,720 --> 00:27:22,760
opposed to absolute. And those
ones become more cost

500
00:27:22,760 --> 00:27:25,720
competitive. Because that's
how a lot of pit design

501
00:27:25,720 --> 00:27:28,600
works at the moment, right, where
you look at a resource and your

502
00:27:28,600 --> 00:27:32,000
ideal is that it's a
massive open cut mine, because

503
00:27:32,000 --> 00:27:35,120
typically that's where your
scale will allow the lowest all

504
00:27:35,120 --> 00:27:37,600
in sustaining cost.
But if it's just that little bit

505
00:27:37,600 --> 00:27:39,640
below the surface, that's
not an option.

506
00:27:39,840 --> 00:27:42,120
And you've got this kind of like
hybrid of, well, we'll do a

507
00:27:42,120 --> 00:27:44,480
small pit and then a portal, and
then we'll make it underground.

508
00:27:44,480 --> 00:27:47,040
Or if it's completely
underground, then it's portal

509
00:27:47,040 --> 00:27:49,920
from day one.
But that's all an economic

510
00:27:49,920 --> 00:27:53,560
decision making factor, right?
If the cost of moving rock

511
00:27:53,800 --> 00:27:57,480
changes, then all those shapes
and distances change. The

512
00:27:57,560 --> 00:28:00,960
depth of the ore defines that
design and the effective all in

513
00:28:00,960 --> 00:28:03,480
sustaining cost you can achieve.
And it's like, what are the

514
00:28:03,480 --> 00:28:05,400
constraints in underground
mining?

515
00:28:05,400 --> 00:28:08,400
Often it's how many interactions
are you having?

516
00:28:08,400 --> 00:28:10,320
What's the vent?
All of those things kind of

517
00:28:10,320 --> 00:28:14,560
disappear in a world where it's,
yeah, automated and robotic, I

518
00:28:14,640 --> 00:28:16,080
think.
So I think they'll probably come

519
00:28:16,080 --> 00:28:20,960
in and out of your little
infrastructure, you know,

520
00:28:21,720 --> 00:28:25,520
more efficiently, you know.
I mean, we're pretty good, but

521
00:28:25,520 --> 00:28:26,280
we're not.
We're not.

522
00:28:26,360 --> 00:28:30,320
I mean, machines can do a lot of
this stuff a lot better than us.

523
00:28:32,800 --> 00:28:34,600
Yeah, I would have thought
they're going to, you know,

524
00:28:34,880 --> 00:28:36,040
going to work it out. What I
like

525
00:28:36,040 --> 00:28:39,000
About your point, right, is
one of the other pit design

526
00:28:39,000 --> 00:28:42,040
techs we're seeing, which is
less robotics and more just

527
00:28:42,040 --> 00:28:47,040
the actual design of your
haulage fleet, is this ultra

528
00:28:47,040 --> 00:28:50,120
light class, right?
So a lot of open cut mining went

529
00:28:50,120 --> 00:28:53,920
to ultra heavy class right.
You see these 400 ton capacity

530
00:28:53,920 --> 00:28:57,000
trucks hauling ore, and it's
amazing because you've only got

531
00:28:57,000 --> 00:29:00,000
so many drivers, but you know,
if you've only got three of

532
00:29:00,000 --> 00:29:03,680
those trucks and one goes down,
you've just plus 33% of your

533
00:29:03,680 --> 00:29:05,760
capacity.
Whereas in an ultralight

534
00:29:05,760 --> 00:29:09,600
paradigm, imagine you've got 50
Cybertrucks, right?

535
00:29:09,600 --> 00:29:12,720
And the Cybertrucks all have
their own tonnage equivalent.

536
00:29:12,720 --> 00:29:15,280
Couldn't agree more that when
one of them breaks down, it

537
00:29:15,280 --> 00:29:17,040
barely affects your capacity,
right?

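[Editor's note: the fleet-redundancy arithmetic above can be made concrete with a tiny calculation. The function and fleet sizes are illustrative, assuming every truck in a fleet carries an equal share of capacity.]

```python
# Losing one of three ultra-heavy trucks removes a third of haulage
# capacity, while losing one of fifty ultra-light trucks barely registers.

def capacity_lost_fraction(fleet_size: int) -> float:
    """Fraction of total haulage capacity lost when one unit breaks down,
    assuming all units in the fleet have equal payload."""
    return 1 / fleet_size

heavy_fleet = capacity_lost_fraction(3)    # three 400 t trucks
light_fleet = capacity_lost_fraction(50)   # fifty small trucks

print(f"Heavy fleet: {heavy_fleet:.0%} of capacity lost")  # 33%
print(f"Light fleet: {light_fleet:.0%} of capacity lost")  # 2%
```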
538
00:29:17,080 --> 00:29:18,880
Look, look
at the changing

539
00:29:18,880 --> 00:29:21,920
landscape in warfare over the
last three years or whatever

540
00:29:21,920 --> 00:29:25,320
it's been now, with
Ukraine and Russia, right?

541
00:29:25,400 --> 00:29:29,040
All of it has gone micronized.
Everything's gotten smaller.

542
00:29:29,200 --> 00:29:32,360
Everything's gone to drone tech.
So it's going to be small,

543
00:29:32,600 --> 00:29:36,320
lightweight, you know, look at
the insect world and how it

544
00:29:36,320 --> 00:29:38,600
operates.
I think this is where we, we end

545
00:29:38,600 --> 00:29:39,200
up.
This is it.

546
00:29:39,200 --> 00:29:43,160
It'll be once you pull the human
out of it as a necessary

547
00:29:43,160 --> 00:29:47,400
input, then I think you can
reshape how you think about

548
00:29:47,400 --> 00:29:51,040
things quite dramatically, yeah.
The war equation is also

549
00:29:51,040 --> 00:29:54,200
quite troubling, especially if
we're talking 15 years in the

550
00:29:54,200 --> 00:29:57,360
future, right?
I think, to keep it on the

551
00:29:57,360 --> 00:30:00,000
tech end of this discussion,
right? You guys put in your

552
00:30:00,000 --> 00:30:02,000
newsletter Weir's acquisition of
Micromine.

553
00:30:02,000 --> 00:30:04,000
Like that was great to see you
guys talking about the mining

554
00:30:04,000 --> 00:30:06,920
tech space.
But I'm not sure if you saw, but

555
00:30:07,320 --> 00:30:11,040
Micromine actually tried to sell
in the recent past to Aspen

556
00:30:11,040 --> 00:30:12,600
Tech, and that deal was under
way.

557
00:30:13,160 --> 00:30:15,680
Yeah, that deal was, you know,
official as far as the parties were

558
00:30:15,680 --> 00:30:17,280
concerned.
It was at the governance stage.

559
00:30:17,280 --> 00:30:21,080
But the sanctions phrasing is
about anything that could

560
00:30:21,080 --> 00:30:24,560
support military activity, and
obviously mining, they just look

561
00:30:24,560 --> 00:30:26,640
at that and they go, yeah,
that's sanctioned.

562
00:30:27,000 --> 00:30:32,480
So it's really interesting to
now think of cybersecurity and

563
00:30:32,480 --> 00:30:37,000
cyber attacks and where that
actually may lead in a tenser

564
00:30:37,000 --> 00:30:40,840
geopolitical environment and the
fact that mining companies in

565
00:30:40,840 --> 00:30:44,440
this 15 years in the future
paradigm, they may actually have

566
00:30:44,440 --> 00:30:48,040
to regularly defend their
operating infrastructure.

567
00:30:48,280 --> 00:30:51,400
So, you know, in our autonomous
ideal, I think remote operated

568
00:30:51,400 --> 00:30:54,720
is still a big part of that.
And so these IROCs, whether it's

569
00:30:54,720 --> 00:30:57,760
autonomous or remotely operated,
right, we're going to still have

570
00:30:57,760 --> 00:31:00,320
to have these IROC centres.
And I love that so

571
00:31:00,400 --> 00:31:02,840
much of the Pilbara is operated
from the Perth CBD, right?

572
00:31:02,840 --> 00:31:06,400
It's, it's showing how Perth is
such a pinnacle of mining tech

573
00:31:06,400 --> 00:31:09,600
at the moment.
But when we evolve these new

574
00:31:09,600 --> 00:31:14,600
techs like quantum, right?
Quantum technology may be seen

575
00:31:14,600 --> 00:31:18,040
as a beautiful lock that's very
secure to encrypt everything and

576
00:31:18,040 --> 00:31:22,160
keep your data safe, but it's
also a lock pick, right?

577
00:31:22,240 --> 00:31:26,480
And the current RSA
encryption will just be

578
00:31:26,480 --> 00:31:30,640
broken by quantum tech.
And so it's kind of an arms

579
00:31:30,640 --> 00:31:33,440
race when one person has a
working quantum computer,

580
00:31:33,600 --> 00:31:34,880
everyone else has got to have
one.

581
00:31:35,160 --> 00:31:37,560
And I think it'll be another one
of those overnight changes,

582
00:31:37,560 --> 00:31:38,720
right?
Nothing's going to happen.

583
00:31:38,920 --> 00:31:41,840
Quantum's kind of still with
fusion in that it's so many

584
00:31:41,840 --> 00:31:44,480
years away that there'll be a
catalyst, right?

585
00:31:44,480 --> 00:31:47,920
And it may be less than 15 years
before you'll see it come into

586
00:31:47,920 --> 00:31:48,880
play.
Yeah.

587
00:31:48,880 --> 00:31:53,200
It's such a yeah.
What's happening with Quantum is

588
00:31:53,200 --> 00:31:55,480
like a whole nother level of
just mind explosion.

589
00:31:55,760 --> 00:31:57,440
Like I still don't understand
it.

590
00:31:57,440 --> 00:31:59,920
I still can't watch a YouTube
video that makes it

591
00:32:00,800 --> 00:32:02,360
comprehensible to
me.

592
00:32:02,360 --> 00:32:04,640
You can ask Perplexity.
I've

593
00:32:04,680 --> 00:32:07,600
tried, I've tried so many times.
Does that make it

594
00:32:07,600 --> 00:32:11,000
simpler?
So, yeah, but you're right,

595
00:32:11,000 --> 00:32:13,760
there's a whole realm
there to unlock even if quantum

596
00:32:13,760 --> 00:32:16,480
doesn't happen.
I think all of this around, you

597
00:32:16,480 --> 00:32:20,320
know, AI processing capacity
and humanoid robotics, that's

598
00:32:20,320 --> 00:32:23,720
all happening anyway.
Quantum's just another level of

599
00:32:24,480 --> 00:32:26,880
things to consider, which is
pretty psychotic.

600
00:32:26,880 --> 00:32:30,320
Yeah.
So, yeah, there are so many

601
00:32:30,320 --> 00:32:31,560
big picture things happening all
at once.

602
00:32:31,560 --> 00:32:35,920
The geostrategic
elements to this are really

603
00:32:35,920 --> 00:32:39,360
important to consider.
We're in a multipolar world now.

604
00:32:39,640 --> 00:32:43,320
Everything that's going on
is about the US v China. We

605
00:32:43,320 --> 00:32:47,640
haven't seen anything like this,
not since World War 2.

606
00:32:47,760 --> 00:32:51,360
The Cold War was
nothing compared to what this

607
00:32:51,360 --> 00:32:53,240
represents.
Russia, I don't think,

608
00:32:53,240 --> 00:32:56,920
was ever really a genuine
competitor to the US.

609
00:32:57,320 --> 00:33:01,240
They have a very serious
competitor and they're losing in

610
00:33:01,240 --> 00:33:05,920
many, many fields.
So, you know, this is a very

611
00:33:06,320 --> 00:33:09,800
different landscape.
I would argue that the interface

612
00:33:09,800 --> 00:33:12,760
with Russia and all those things
are actually derivatives of the

613
00:33:12,760 --> 00:33:16,000
bigger picture play, which is
China V the US.

614
00:33:16,000 --> 00:33:20,200
And there is clearly a
technology arms race that has

615
00:33:20,200 --> 00:33:22,840
been declared.
War has been declared on that

616
00:33:22,840 --> 00:33:26,120
front.
And I think on mineral security

617
00:33:26,120 --> 00:33:29,480
and mineral processing, we are
going to see war declared as

618
00:33:29,480 --> 00:33:30,600
well.
For want of

619
00:33:30,600 --> 00:33:33,240
a better word, without trying to
be, you know, too hyperbolic

620
00:33:33,240 --> 00:33:38,920
about it.
But the

621
00:33:38,920 --> 00:33:45,080
Western liberal sort of groups,
be it EU, be it, you know, the

622
00:33:45,080 --> 00:33:51,080
Democrats within the US, I think
they had a very, very poor

623
00:33:51,240 --> 00:33:53,480
policy set to try and take this
on.

624
00:33:53,640 --> 00:33:58,880
The IRA was one of the worst
named, worst executed policy

625
00:33:58,880 --> 00:34:00,560
pieces ever.
It was like a trillion dollars.

626
00:34:00,560 --> 00:34:05,560
And I think China went from like
65% of the refined lithium

627
00:34:05,560 --> 00:34:10,320
market to 70% in the five years.
So it's like, you fucking

628
00:34:10,320 --> 00:34:12,360
spent all that money and
you went backwards.

629
00:34:12,440 --> 00:34:16,199
If the end game was to,
you know, pull China's market

630
00:34:16,199 --> 00:34:19,239
share lower, and I think
that is the end game.

631
00:34:19,239 --> 00:34:22,480
And so I think we're going
to see a lot more radical policy

632
00:34:22,760 --> 00:34:26,080
from, from the Trump side of
things.

633
00:34:26,080 --> 00:34:29,320
And this
game has only just begun.

634
00:34:29,320 --> 00:34:35,239
And that involves feedstock
security and processing, and

635
00:34:35,639 --> 00:34:40,360
how that plays out.
Like, you know, there's a lot on

636
00:34:40,360 --> 00:34:42,440
that. But I would argue
this: again, if

637
00:34:42,440 --> 00:34:45,679
you go big picture and say 90%
of all goods and services

638
00:34:45,679 --> 00:34:48,800
are getting a lot cheaper, then
where are you going to put

639
00:34:48,800 --> 00:34:50,840
capital to work?
Well, you're going to chase hard

640
00:34:50,840 --> 00:34:53,080
assets, you're going to chase
property and you're going to

641
00:34:53,080 --> 00:34:56,040
chase minerals.
I think, you know,

642
00:34:56,040 --> 00:34:57,560
and you're going to chase
collectibles, things that can't

643
00:34:57,560 --> 00:35:00,280
be replicated.
Putting more capital to work

644
00:35:00,560 --> 00:35:04,080
chasing minerals, that's a
bloody exploration playbook that

645
00:35:04,160 --> 00:35:06,240
K-Drill can facilitate for
you.

646
00:35:06,920 --> 00:35:09,920
Absolutely, it's bloody high-
quality sample management, cost

647
00:35:09,920 --> 00:35:13,320
per metre drilled, budget
management and just the long

648
00:35:13,320 --> 00:35:20,440
standing exploration drilling
expertise of Ryan O'Sullivan.

649
00:35:20,680 --> 00:35:21,960
Ryan O'Sullivan, the man
himself.

650
00:35:22,720 --> 00:35:24,960
Get him up to find some
minerals.

651
00:35:26,240 --> 00:35:29,320
And I'd like
you to riff off this: if

652
00:35:29,480 --> 00:35:32,600
AI and like robotics is in a lot
of ways as long as you have

653
00:35:32,600 --> 00:35:34,880
access to it.
This is a tremendous equalizer

654
00:35:34,880 --> 00:35:36,840
in terms of capability, right?
Yeah.

655
00:35:36,960 --> 00:35:40,040
So what can actually
differentiate the prosperity of

656
00:35:40,040 --> 00:35:43,880
a nation is the governance
settings to let that capability

657
00:35:43,880 --> 00:35:45,960
prosper.
Ask the AI to build your lithium

658
00:35:45,960 --> 00:35:47,440
refinery.
Ask the, you know, like what?

659
00:35:47,440 --> 00:35:48,480
Yeah.
But as long as the governance

660
00:35:48,480 --> 00:35:51,880
can enable it.
Well, we'll ask AI to fix

661
00:35:51,880 --> 00:35:54,640
governance for starters.
That'd be fucking right.

662
00:35:54,640 --> 00:35:56,600
And I think that's what DOGE is
already doing, right?

663
00:35:56,600 --> 00:35:58,400
Like that, that arguably that's
what they're doing.

664
00:35:58,400 --> 00:36:02,360
They're going for
transparency, awareness, change,

665
00:36:02,840 --> 00:36:05,760
and that has to happen.
I would argue the governance all

666
00:36:05,760 --> 00:36:09,360
over the West, we are hopelessly
governed, hopeless and it's

667
00:36:09,360 --> 00:36:11,720
getting worse.
And, and our competitive race

668
00:36:11,720 --> 00:36:15,640
with China, bizarrely, we're
losing to a quasi-communist

669
00:36:15,640 --> 00:36:20,040
structure. That is
unfathomable from a Western

670
00:36:20,040 --> 00:36:22,200
mindset.
And we're having to eat a

671
00:36:22,200 --> 00:36:23,680
lot of humble pie.
And I think there's a lot of

672
00:36:23,680 --> 00:36:26,160
hubris in the West.
We're struggling to accept this,

673
00:36:26,400 --> 00:36:29,160
struggling to accept that we're
not the smartest guy in the room

674
00:36:29,640 --> 00:36:34,360
and that a non-democratic
system is beating us.

675
00:36:34,840 --> 00:36:38,240
We have to get much better.
And I, I think what Elon and

676
00:36:38,240 --> 00:36:42,160
Trump represent is the
one chance we have to get much

677
00:36:42,160 --> 00:36:45,880
better and it better happen soon
or we are so far behind and the

678
00:36:45,880 --> 00:36:48,400
EU is already fucking gone.
Like they're so far behind,

679
00:36:48,400 --> 00:36:52,240
they're hopeless.
So sorry, sorry, I know that's a

680
00:36:52,280 --> 00:36:55,360
long way off topic on AI, but
no, no, that's a lot of these

681
00:36:55,360 --> 00:36:58,760
issues, they're conflating
and they're all happening at the

682
00:36:58,760 --> 00:37:02,000
same time.
And the chaos in markets that

683
00:37:02,000 --> 00:37:04,680
we're seeing at the moment, I
think it's because everybody is

684
00:37:04,680 --> 00:37:07,800
like, wow, this is tectonic.
Like everything is changing

685
00:37:07,800 --> 00:37:11,160
underneath our feet, and it has
to. The US,

686
00:37:11,160 --> 00:37:15,040
the West, has been on a path of
just self-destruction.

687
00:37:15,040 --> 00:37:18,000
We have de-industrialized.
We've got horrific balance

688
00:37:18,000 --> 00:37:19,680
sheets.
We're spending money

689
00:37:19,680 --> 00:37:23,520
we don't have. Social Security.
We have an illusion of

690
00:37:23,520 --> 00:37:26,520
democracy in many respects
under, you know, entrenched

691
00:37:26,520 --> 00:37:30,000
bureaucrats that don't move,
that aren't accountable, that

692
00:37:30,080 --> 00:37:34,880
there's a lot to this,
and it's bizarre that the

693
00:37:34,880 --> 00:37:38,320
change agent is Donald Trump.
I'm happy to admit that that's

694
00:37:38,320 --> 00:37:42,880
an odd character to
choose, but it is so

695
00:37:42,880 --> 00:37:47,760
necessary.
And you know, fuck, I hope

696
00:37:47,760 --> 00:37:50,280
they get it right.
I just hope they execute

697
00:37:50,280 --> 00:37:52,400
quickly.
We wear the pain.

698
00:37:52,640 --> 00:37:54,720
We've been living this
unsustainable, you know,

699
00:37:54,720 --> 00:38:00,520
government funded GDP growth
between fiscal looseness and

700
00:38:00,520 --> 00:38:03,000
immigration that are both
unsustainable, whether it's

701
00:38:03,000 --> 00:38:04,680
Australia, the US, take your
pick.

702
00:38:05,080 --> 00:38:09,520
These things have fed GDP
prints and false senses of

703
00:38:09,520 --> 00:38:12,080
economic growth.
And I think we're going to wear

704
00:38:12,080 --> 00:38:13,880
this pain quickly.
They're going to do it quickly.

705
00:38:13,880 --> 00:38:15,520
They got a midterm to get ahead
of.

706
00:38:16,200 --> 00:38:18,360
I think that's a good thing.
It feels bad in financial

707
00:38:18,360 --> 00:38:20,600
markets, but I think it is going
to translate eventually to

708
00:38:21,000 --> 00:38:22,360
a rotation of capital out of the
US.

709
00:38:22,360 --> 00:38:25,520
You know, 70% of every global
equity dollar sits in the US.

710
00:38:25,800 --> 00:38:28,760
Having some of that go outside
and, and look elsewhere, whether

711
00:38:28,760 --> 00:38:31,920
it's, you know, EM or
hard assets, like that's a good

712
00:38:31,920 --> 00:38:34,480
thing.
So, yeah, sorry, there's

713
00:38:34,480 --> 00:38:37,520
a lot of big picture stuff going
on and all at the same time

714
00:38:37,520 --> 00:38:40,880
we've got AI playing
out and big arms races and,

715
00:38:41,200 --> 00:38:43,320
and the cool thing for humanity,
because that sounds very

716
00:38:43,320 --> 00:38:46,680
stressful and very change
oriented and everyone struggles

717
00:38:46,680 --> 00:38:48,840
with change.
No one, no one goes yay change.

718
00:38:49,320 --> 00:38:52,840
But but the good thing is I
think ultimately this is going

719
00:38:52,840 --> 00:38:56,680
to make us far more productive
humans like, like at a global

720
00:38:56,680 --> 00:38:58,680
level.
And and I don't think anybody

721
00:38:58,680 --> 00:39:01,680
sits here today and goes, oh,
you know, everybody should go

722
00:39:01,680 --> 00:39:04,120
out and cut wheat with a sickle
because that employs more

723
00:39:04,120 --> 00:39:06,480
people, right?
That's not a good idea.

724
00:39:06,480 --> 00:39:09,760
We all know that.
So equally, the idea of AI

725
00:39:10,640 --> 00:39:14,840
displacing jobs is a good thing
as long as we go find other

726
00:39:14,840 --> 00:39:17,120
things to do.
So I, I think there's a lot of

727
00:39:17,120 --> 00:39:20,080
positives in this.
It feels very uncomfortable, but

728
00:39:20,080 --> 00:39:23,160
I think there's a lot to be
super excited about.

729
00:39:23,160 --> 00:39:26,520
And I think AI just, you know,
in the here and now, even

730
00:39:26,520 --> 00:39:29,280
without getting too, you know,
forward-looking, just at a

731
00:39:29,280 --> 00:39:32,080
personal level, I know I've felt
it over the last two months,

732
00:39:32,400 --> 00:39:37,080
just embracing and retraining my
brain and being able to attack

733
00:39:37,080 --> 00:39:40,280
things that I otherwise would
park or put off, because I feel

734
00:39:40,280 --> 00:39:43,880
now empowered by this army of
intelligence and processing

735
00:39:43,880 --> 00:39:45,640
speed behind.
Yeah.

736
00:39:46,040 --> 00:39:49,280
And, and you know that that
that's here and now, right now,

737
00:39:49,280 --> 00:39:52,800
right now, today, you as a human
being in your personal and your

738
00:39:52,800 --> 00:39:57,640
professional existence, you can
optimize yourself so you know,

739
00:39:58,160 --> 00:40:01,920
so quickly, and you can be
a much more effective version of

740
00:40:01,920 --> 00:40:04,720
yourself right here, right now.
That's fucking exciting.

741
00:40:04,720 --> 00:40:07,480
Like I I love that.
I don't think there's any point

742
00:40:07,480 --> 00:40:11,160
in getting too scared or
worried about dystopian

743
00:40:11,160 --> 00:40:14,520
outcomes because it's happening
anyway and it may all blow up in

744
00:40:14,520 --> 00:40:16,120
our faces.
This may be more dangerous than

745
00:40:16,120 --> 00:40:17,760
nuclear weapons.
It may very well be, but

746
00:40:17,760 --> 00:40:21,600
yeah, but.
One of the things that is

747
00:40:21,600 --> 00:40:25,720
interesting about an AI kind of
paradigm when you put it as a

748
00:40:25,720 --> 00:40:29,160
weapon is that, you know, a
nuclear bomb does a lot of

749
00:40:29,160 --> 00:40:32,640
damage and it's a very terrible
event and it's pretty clear who

750
00:40:32,640 --> 00:40:36,120
launched it, right?
Whereas an AI cyber attack can

751
00:40:36,120 --> 00:40:38,680
actually do no damage to assets,
right?

752
00:40:38,680 --> 00:40:41,800
And it's also very difficult to
trace where it came from.

753
00:40:41,800 --> 00:40:43,840
So the threshold physical.
Damage, yeah.

754
00:40:43,960 --> 00:40:47,720
You can essentially turn off an
asset and then maybe use it

755
00:40:47,720 --> 00:40:49,400
later if you like the asset,
right, right.

756
00:40:49,800 --> 00:40:53,560
And I think that's coming into
that cybersecurity point where

757
00:40:53,800 --> 00:40:57,200
we really do need to take
that into consideration with the

758
00:40:57,200 --> 00:41:00,600
the risks that we're looking at.
And I think there's lots of

759
00:41:00,600 --> 00:41:02,960
opportunities to focus on in the
landscape.

760
00:41:03,280 --> 00:41:06,960
But your point about
democratizing and equalizing

761
00:41:07,360 --> 00:41:10,040
mining proficiency is a really
beautiful one, right?

762
00:41:10,360 --> 00:41:14,840
Like when you're trying to
collect data for geological

763
00:41:14,840 --> 00:41:17,800
observations, you know, you
don't want to be precluded from

764
00:41:17,800 --> 00:41:20,320
using certain instruments and
measurement devices because

765
00:41:20,320 --> 00:41:23,160
that's proprietary and, you know,
some nation is

766
00:41:23,160 --> 00:41:27,200
hoarding that tech. The better
we can get towards a way to

767
00:43:27,200 --> 00:43:31,680
operate a mine that has uniform
vendors, a very well-established

768
00:43:31,680 --> 00:43:35,920
offering, to ensure that you can
customize it to that resource,

769
00:41:35,920 --> 00:41:38,640
because it's our Earth that's
precious and unique, right?

770
00:41:38,640 --> 00:41:41,840
The way to mine it should just
adapt in terms of the

771
00:41:41,840 --> 00:41:47,160
best science possible.
The implications of this like

772
00:41:47,160 --> 00:41:52,880
pretty substantial technological
seismic shift that is underway

773
00:41:53,720 --> 00:41:57,760
from like an investment lens.
Underground ore bodies become

774
00:41:58,160 --> 00:41:59,680
more attractive on a relative
basis.

775
00:41:59,680 --> 00:42:02,080
There's maybe interesting
optionality there.

776
00:42:03,160 --> 00:42:07,520
What else comes to mind from an
investment lens, Rusty? You

777
00:42:07,520 --> 00:42:10,000
make money as a as a contrarian,
yeah.

778
00:42:11,120 --> 00:42:13,640
Probably a few ways to think
about that. One's at the asset

779
00:42:13,640 --> 00:42:16,040
level.
So yeah, for sure, I think

780
00:42:16,040 --> 00:42:20,680
underground vs open cut has a
relative

781
00:42:20,680 --> 00:42:26,440
advantage.
Jurisdictional application will

782
00:42:26,440 --> 00:42:29,200
be interesting here.
I can't help but feel like the

783
00:42:29,200 --> 00:42:32,760
first world is going to win on
this one because it is high.

784
00:42:32,880 --> 00:42:35,280
It's a higher cost environment
so it makes a bigger

785
00:42:35,280 --> 00:42:36,960
difference.
So that's a salient

786
00:42:36,960 --> 00:42:43,040
Point, right. Because, like, take
lithium for example, a huge

787
00:42:43,400 --> 00:42:46,360
thing that kind of has been an
overhang on that entire

788
00:42:46,360 --> 00:42:48,680
industry has been the fact that
like there's a lot of near

789
00:42:48,680 --> 00:42:52,760
surface, really capital-light
tonnes that can just kind of

790
00:42:52,760 --> 00:42:56,680
get up like that.
But we're uncompetitive doing

791
00:42:57,160 --> 00:43:00,520
that capital-light kind of
model in WA, because

792
00:43:00,520 --> 00:43:03,080
our labour got so enormously
high, yeah, you need these

793
00:43:03,200 --> 00:43:05,440
higher picks, high CapEx
as a result.

794
00:43:05,440 --> 00:43:09,000
But you have the equalizer, be
it the technological

795
00:43:09,000 --> 00:43:12,360
productivity applied to the
West, where, yeah, the easy

796
00:43:12,360 --> 00:43:13,320
wins are gone.
It'd be

797
00:43:13,320 --> 00:43:15,800
worth asking Bill Beament what
he thinks of some of this

798
00:43:15,800 --> 00:43:18,240
stuff?
There's an argument here that,

799
00:43:18,280 --> 00:43:21,920
I know I'll get lots of knocks
for this one, but there's an

800
00:43:21,920 --> 00:43:25,800
argument here at some point
where you say, well, do you even

801
00:43:25,840 --> 00:43:28,680
allow open cut mining?
You know, if you look at

802
00:43:28,680 --> 00:43:33,600
mining's impacts
environmentally, the scar tissue

803
00:43:33,600 --> 00:43:39,360
is the hole itself
and the waste right next door to

804
00:43:39,360 --> 00:43:40,480
it.
Typically speaking.

805
00:43:40,480 --> 00:43:44,360
And in big open cut settings,
you know, your copper and your

806
00:43:44,360 --> 00:43:47,040
super pits and like they're not
insignificant.

807
00:43:48,240 --> 00:43:51,640
I think mining's reputation gets
like it's overstated what an

808
00:43:51,640 --> 00:43:54,600
impact it has on the planet.
About 1% of the continental

809
00:43:54,600 --> 00:43:57,720
landmass is impacted by mining,
50 plus percent impacted by

810
00:43:57,720 --> 00:44:00,200
agriculture.
So I'd say that upfront.

811
00:44:00,440 --> 00:44:02,840
But in terms of getting, you
know, better results as an

812
00:44:02,840 --> 00:44:06,880
industry, I think there'll be a
big argument to say: well, if

813
00:44:06,920 --> 00:44:09,880
you can suddenly cost-
effectively pull this stuff out

814
00:44:09,880 --> 00:44:15,040
from underground and a huge
chunk of the processed waste

815
00:44:15,280 --> 00:44:20,240
then gets fed back in, well,
isn't that going to be a

816
00:44:20,360 --> 00:44:22,880
significantly better
environmental outcome?

817
00:44:22,880 --> 00:44:25,120
I think that's always kind of
been the Holy Grail of mining.

818
00:44:25,120 --> 00:44:28,040
It's just that it's been
very hard to justify.

819
00:44:28,040 --> 00:44:31,560
And if you look at like an asset
like NexGen's, you know, that

820
00:44:31,560 --> 00:44:35,400
that tells you, right, if you've
got the margin, you can do it.

821
00:44:36,200 --> 00:44:38,960
It's just a cost function.
And if suddenly those

822
00:44:38,960 --> 00:44:41,760
underground ops become way more
cost effective on a relative

823
00:44:41,760 --> 00:44:44,560
basis, then I think that
could be a really great thing.

824
00:44:44,560 --> 00:44:46,680
Like I think we'd all love to
see that.

825
00:44:47,040 --> 00:44:48,560
Nobody would
say no to that, right?

826
00:44:48,560 --> 00:44:50,840
Yeah, I
wouldn't think so. Like, it'd be so

827
00:44:50,840 --> 00:44:54,480
hard to justify going and
ripping big, you know,

828
00:44:54,480 --> 00:44:58,120
generational scar tissue on the
surface of the Earth if you

829
00:44:58,120 --> 00:45:00,600
don't need to.
I'm not here to throw stones at

830
00:45:00,600 --> 00:45:02,800
that either, right?
That's been a necessary part of

831
00:45:03,200 --> 00:45:05,840
human endeavour.
And as I said, it's 1% of

832
00:45:05,840 --> 00:45:07,680
the surface of the Earth that
gets impacted.

833
00:45:07,680 --> 00:45:09,680
So, and it's
a reflection of human demand.

834
00:45:09,680 --> 00:45:11,960
That's well, right it.
Totally, totally.

835
00:45:11,960 --> 00:45:14,360
And, and you know, those things
will remediate over time and

836
00:45:14,360 --> 00:45:17,200
and, you know, the
Earth will do its thing.

837
00:45:17,200 --> 00:45:21,760
You know, in a geological sense,
in any sort of universe sense,

838
00:45:22,360 --> 00:45:24,600
we're irrelevant.
It doesn't matter.

839
00:45:24,600 --> 00:45:27,680
But, but without getting too
esoteric, I think

840
00:45:28,120 --> 00:45:30,080
it's just going to be a far
better outcome.

841
00:45:30,080 --> 00:45:32,320
If you, if you're running
underground mines and you're

842
00:45:32,320 --> 00:45:35,320
putting, you know, a really
high portion of your

843
00:45:35,320 --> 00:45:39,440
waste back into that environment
in a solid manner that's, you

844
00:45:39,440 --> 00:45:43,200
know, it's not water soluble,
etcetera, then that's going

845
00:45:43,200 --> 00:45:45,960
to be a great outcome.
And, and mining all of a sudden

846
00:45:46,440 --> 00:45:49,800
as an interface at the surface,
which is where people feel it

847
00:45:50,080 --> 00:45:52,600
where like competing land use
issues, whether it's like

848
00:45:52,600 --> 00:45:56,720
tourism or agriculture or
whatever, suddenly there's very

849
00:45:56,720 --> 00:46:00,440
little surface expression
suddenly like, are you really a

850
00:46:00,440 --> 00:46:02,680
competing land use?
You shouldn't be.

851
00:46:03,040 --> 00:46:06,280
So, you know, I feel like social
acceptance towards mining could

852
00:46:06,280 --> 00:46:08,520
get a lot greater.
That would be a great thing too.

853
00:46:08,640 --> 00:46:10,680
Yeah.
And that again, first world

854
00:46:10,920 --> 00:46:13,880
because that's where
these things get roadblocked the

855
00:46:13,880 --> 00:46:15,080
hardest.
Yeah.

856
00:46:15,160 --> 00:46:16,440
And maybe it speeds things up,
right?

857
00:46:16,440 --> 00:46:19,480
We go back to the government and
getting approvals and all these

858
00:46:19,480 --> 00:46:21,120
things.
And you know, we all know that

859
00:46:21,120 --> 00:46:24,080
it takes 15-plus years in
the Western world from discovery

860
00:46:24,080 --> 00:46:26,960
to getting a mine started.
You can cut that down.

861
00:46:27,520 --> 00:46:30,160
It's relevant to everything
we've said, economics, the

862
00:46:30,160 --> 00:46:32,760
whole suite of it.
Yep, it, it all kind of ties in

863
00:46:32,760 --> 00:46:33,840
there.
So it'd be.

864
00:46:34,920 --> 00:46:37,760
Yeah, I reckon if
you're running a big mining

865
00:46:37,760 --> 00:46:41,600
company today, this should be
the first thing you talk

866
00:46:41,600 --> 00:46:45,600
about every quarterly.
I can't see how AI is not the

867
00:46:45,680 --> 00:46:47,760
most important thing you're
discussing.

868
00:46:48,000 --> 00:46:52,200
Like I've said to all my staff
quite brutally, hey, I will make

869
00:46:52,200 --> 00:46:55,840
you redundant before AI does if
you don't do your AI training

870
00:46:55,840 --> 00:46:57,600
every day.
And I have

871
00:46:57,600 --> 00:47:01,960
been, I like to think, a pretty
mellow person to work with up

872
00:47:01,960 --> 00:47:04,600
until now.
But I'm so passionate

873
00:47:04,600 --> 00:47:07,080
about this.
Like if you don't embrace it and

874
00:47:07,080 --> 00:47:09,840
you know, we've got people that
are 70 years old in our

875
00:47:09,840 --> 00:47:15,800
organization, won't name anyone,
Dougie, and, you know,

876
00:47:15,800 --> 00:47:19,800
embracing AI and just doing it
day in, day out, is going to

877
00:47:19,800 --> 00:47:23,320
differentiate where you land.
And, and that applies not just

878
00:47:23,320 --> 00:47:25,120
to the individual, but to the
organization.

879
00:47:25,120 --> 00:47:29,200
And big companies need to be
talking about this front and

880
00:47:29,200 --> 00:47:30,840
center.
And it's going to be

881
00:47:30,840 --> 00:47:34,680
fascinating how these
competitive landscapes evolve

882
00:47:34,680 --> 00:47:39,160
because small teams, single
person teams, small human teams

883
00:47:39,520 --> 00:47:44,680
are going to be so much more
efficient than big bureaucratic

884
00:47:45,200 --> 00:47:48,720
encumbered organizations.
And I'm going to love

885
00:47:48,720 --> 00:47:51,480
watching that journey like that
is going to be so fucking cool.

886
00:47:51,480 --> 00:47:53,360
When did
You come to this realization,

887
00:47:53,360 --> 00:47:55,880
I'm really curious, like when
did it hit you that AI

888
00:47:55,880 --> 00:47:58,880
is the thing?
Like, before I rang you guys and

889
00:47:58,880 --> 00:48:01,160
said let's do it, let's do a
piece, come on.

890
00:48:03,040 --> 00:48:06,280
It's certainly over the
Christmas break.

891
00:48:06,280 --> 00:48:09,680
I had the, you know, I feel like
Christmas is just such a great

892
00:48:09,680 --> 00:48:13,320
time of the year to just sit
back and watch as much cricket

893
00:48:13,320 --> 00:48:18,960
as possible and just clear the
brain and just think as

894
00:48:18,960 --> 00:48:22,680
laterally as you can.
And I, I do that in a few

895
00:48:22,680 --> 00:48:26,800
different ways, but it was
probably over.

896
00:48:26,800 --> 00:48:28,960
It had been simmering away for
probably six months.

897
00:48:28,960 --> 00:48:31,360
And then at that Christmas
break, I felt like I had time to

898
00:48:31,360 --> 00:48:34,400
crystallize a bunch of things.
And as it turned out, you know,

899
00:48:34,600 --> 00:48:38,200
the team, Dan and
James in particular, came back

900
00:48:38,200 --> 00:48:40,840
from the break with the same
landings, which I was stoked

901
00:48:40,840 --> 00:48:42,280
about.
So I was like, hey, guys, this

902
00:48:42,280 --> 00:48:44,560
is, this is where my head's at.
This is how I'm seeing things.

903
00:48:44,840 --> 00:48:46,800
Do you think, am I off piste
here?

904
00:48:47,240 --> 00:48:49,040
And, yeah, it wasn't
the case.

905
00:48:49,040 --> 00:48:52,680
And I think, you know, for
us as an organization, you

906
00:48:52,680 --> 00:48:55,280
know, we're doing an enormous
amount of work at the individual

907
00:48:55,280 --> 00:48:56,640
level.
So like mandatory daily

908
00:48:56,640 --> 00:48:59,800
training, like we are all
starting just grassroots, you

909
00:48:59,800 --> 00:49:03,880
know, the, the Google online
academy sort of platform, just

910
00:49:04,200 --> 00:49:07,160
get the basics right, build the
foundation at an individual

911
00:49:07,160 --> 00:49:08,840
level.
And you know, James is ex-

912
00:49:08,840 --> 00:49:10,760
Goldman Sachs data analytics.
So he's at a different level,

913
00:49:10,760 --> 00:49:13,000
but we're all working together
at the basics.

914
00:49:13,000 --> 00:49:16,560
And then we'll find, I think
routes where we can specialize

915
00:49:16,560 --> 00:49:20,400
in it and, and training areas
that are going to help us within

916
00:49:20,440 --> 00:49:24,120
our own specializations.
But yeah, you know, as an,

917
00:49:25,160 --> 00:49:27,200
as an organization, that's where
we've started at the individual

918
00:49:27,200 --> 00:49:28,280
level.
And then we've done a huge

919
00:49:28,280 --> 00:49:32,640
amount of mapping around our
systems and processes and trying

920
00:49:32,640 --> 00:49:38,320
to prioritize where it is
we embrace AI and spend time and

921
00:49:38,320 --> 00:49:44,280
some capital to make that
happen as quickly as we can.

922
00:49:44,840 --> 00:49:46,960
And so we're already using
it.

923
00:49:48,120 --> 00:49:54,360
And, and specifically, this is
around real time data analytics,

924
00:49:54,360 --> 00:49:56,920
so to speak.
And I'll use

925
00:49:56,920 --> 00:50:01,240
words wrong and probably get
lots of comments about this, the

926
00:50:01,240 --> 00:50:02,760
vernacular.
I'm sure I'll cook it.

927
00:50:02,760 --> 00:50:07,640
But the premise is
like our job is

928
00:50:07,640 --> 00:50:10,400
pretty simple, right?
We have a chunk of capital

929
00:50:10,400 --> 00:50:14,680
to invest at any given point
in time and we ask 2 pertinent

930
00:50:14,680 --> 00:50:19,120
questions through the day.
And that is, first of all, do we

931
00:50:19,120 --> 00:50:23,360
own something that we no longer
should? And then secondly,

932
00:50:23,480 --> 00:50:25,720
because the first thing you want
to do is look at your portfolio.

933
00:50:25,960 --> 00:50:27,440
That's the be-all.

934
00:50:27,440 --> 00:50:29,680
The rest of it's just noise.
It's what you own that matters.

935
00:50:29,840 --> 00:50:31,480
So first of all, do you own
something that you no longer

936
00:50:31,480 --> 00:50:33,640
should?
And then secondly, is it

937
00:50:33,640 --> 00:50:35,920
something that you don't own
that you now should?

938
00:50:36,400 --> 00:50:38,040
And that's how we're thinking
about it.

939
00:50:38,040 --> 00:50:40,280
And that's very simple, but it
needs to happen in real time.

940
00:50:40,680 --> 00:50:44,360
And so where we're trying to get
to is like this army of

941
00:50:44,360 --> 00:50:47,960
analysts, if you like, that are
in real time telling us, each

942
00:50:48,960 --> 00:50:53,440
with a different lens.
So you can conceptualize an

943
00:50:53,440 --> 00:50:55,440
investment model in many
different ways.

944
00:50:55,640 --> 00:50:58,320
And I would say in different
macroeconomic climates and

945
00:50:58,320 --> 00:51:03,160
commodity climates and market
climates, different models might

946
00:51:03,160 --> 00:51:05,200
work.
Like for example, in a

947
00:51:05,200 --> 00:51:11,040
fizzy bull market where you
know, uranium is 150 a pound and

948
00:51:11,080 --> 00:51:14,200
you know, you're
going to just chase any deal you

949
00:51:14,200 --> 00:51:15,760
can.
So you're going to optimize for

950
00:51:15,760 --> 00:51:19,080
like access to deal flow.
And that's a very unique, you

951
00:51:19,080 --> 00:51:22,960
know, that's not my DNA.
But at a certain point in time

952
00:51:22,960 --> 00:51:25,360
that might make sense.
And there'll be a lens for that.

953
00:51:27,360 --> 00:51:31,640
Then, you know, our DNA is
more in deep value kind of

954
00:51:31,640 --> 00:51:34,520
cyclically looking at things.
So tell me something that's deep

955
00:51:34,520 --> 00:51:37,040
value as we perceive deep value
to be.

956
00:51:37,040 --> 00:51:40,720
So let's define that.
That's like, you know, pristine

957
00:51:40,720 --> 00:51:43,920
balance sheets, ideally the
commodities trading into the

958
00:51:43,920 --> 00:51:46,440
cost curve because that's when
we tend to find associated low

959
00:51:46,440 --> 00:51:49,080
valuations. Prove that we have
a low valuation.

960
00:51:49,080 --> 00:51:52,880
Show me, you know, PEs or
whatever the case may be.

961
00:51:52,880 --> 00:51:56,120
Ideally they're also in a Q1 or
Q2 of the cost curve.

962
00:51:56,120 --> 00:51:57,960
So they're bulletproof if this
gets worse before it gets

963
00:51:57,960 --> 00:51:59,600
better.
So it's like defining those

964
00:51:59,600 --> 00:52:04,040
characteristics that
are a model and then trying to

965
00:52:04,480 --> 00:52:07,000
sort of press the button at any
given point in time and say that

966
00:52:07,000 --> 00:52:08,960
model, that model and that
model, I think are relevant in

967
00:52:08,960 --> 00:52:11,880
this climate.
And tell me all day, every

968
00:52:11,880 --> 00:52:16,400
fucking day, wherever, whenever,
when something is of

969
00:52:16,400 --> 00:52:17,960
interest.
And that's a sort of screening

970
00:52:17,960 --> 00:52:20,840
process.
And, and then it still requires

971
00:52:20,840 --> 00:52:23,160
the human interface to check
that screening and make sure

972
00:52:23,160 --> 00:52:25,120
that that's done the way you
think it should be.

973
00:52:25,120 --> 00:52:27,000
And, you know, it's
got the right answers.

974
00:52:27,200 --> 00:52:29,800
And then you go get your
three best ideas and then go

975
00:52:29,800 --> 00:52:33,280
have the bottom up, you know,
workflow where you start with

976
00:52:33,280 --> 00:52:36,880
three pieces of analysis from
research houses or,

977
00:52:37,040 --> 00:52:41,000
you know, brokers' sell side.
Shortcut your way around

978
00:52:41,000 --> 00:52:42,520
that work stream as much as
possible.

979
00:52:43,080 --> 00:52:45,440
Chuck that into your own model.
Have the meeting with

980
00:52:45,440 --> 00:52:46,160
management.
Do it.

981
00:52:46,160 --> 00:52:50,320
You know, do that ground up
work, but do that.

982
00:52:50,320 --> 00:52:54,280
But critically, shortcut
your way to that list of three

983
00:52:54,280 --> 00:52:58,680
names really fucking quickly on
like a daily basis.
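The screening "lens" Rusty describes — pristine balance sheet, commodity trading into the cost curve, cheap valuation, Q1/Q2 cost position, distilled to a shortlist of three names for the human bottom-up work — can be caricatured in a few lines. A toy illustration only: the company names, fields and thresholds here are hypothetical, not Nero's actual model:

```python
from dataclasses import dataclass

@dataclass
class Company:
    name: str
    net_cash: float              # pristine-balance-sheet proxy (A$m)
    pe: float                    # crude valuation proxy
    cost_quartile: int           # 1 = lowest-cost quartile of the curve
    commodity_percentile: float  # where spot sits in the cost curve, 0 = deep in it

def deep_value_screen(universe, max_pe=8.0, max_quartile=2,
                      max_commodity_percentile=0.25):
    """One 'lens': net cash, commodity trading into the cost curve,
    low valuation, bulletproof Q1/Q2 cost position."""
    hits = [c for c in universe
            if c.net_cash > 0
            and c.pe <= max_pe
            and c.cost_quartile <= max_quartile
            and c.commodity_percentile <= max_commodity_percentile]
    # Rank cheapest first and hand the top three to a human for the
    # bottom-up work (own model, management meetings, etc.).
    return sorted(hits, key=lambda c: c.pe)[:3]

universe = [
    Company("MinerA", 120.0, 5.5, 1, 0.10),
    Company("MinerB", -40.0, 4.0, 1, 0.10),   # levered: fails the screen
    Company("MinerC", 60.0, 7.0, 2, 0.20),
    Company("MinerD", 200.0, 12.0, 1, 0.10),  # not cheap enough
]
print([c.name for c in deep_value_screen(universe)])
```

The point of the sketch is that the "lens" is just a parameter set: swap the thresholds and you have the fizzy-bull-market model instead, and the screen can run all day, every day.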

984
00:52:58,760 --> 00:53:02,720
That perspective reminds me so
much of Stanley Druckenmiller.

985
00:53:02,720 --> 00:53:04,840
Absolutely.
You know, legend in all respects

986
00:53:04,840 --> 00:53:10,600
in our investment game and
clearly traded the whole AI

987
00:53:10,600 --> 00:53:14,320
thematic phenomenally himself.
But he talks about, you know, in

988
00:53:14,320 --> 00:53:17,320
a recent interview that the
ability to integrate AI into

989
00:53:17,320 --> 00:53:19,320
decision making as an investment
manager, it's like the

990
00:53:19,320 --> 00:53:22,560
ultimate
investment decision making.

991
00:53:22,560 --> 00:53:24,920
It will happen with the
augmentation.

992
00:53:24,920 --> 00:53:28,280
It's augmentation for sure over
the intuition that comes with,

993
00:53:29,080 --> 00:53:30,600
you know, the reps. Like, this
is

994
00:53:30,600 --> 00:53:33,120
Fascinating, right?
And, and where I think, you

995
00:53:33,120 --> 00:53:35,280
know, getting with the program is
so important, because there is a

996
00:53:35,280 --> 00:53:39,040
role for humans here.
We will be displaced if we don't

997
00:53:39,160 --> 00:53:45,160
embrace the augmented AI.
But there is a role, and

998
00:53:45,160 --> 00:53:50,400
I think AI is a construct
of us and it's a reflection of

999
00:53:50,400 --> 00:53:53,640
us in many ways.
And it works, certainly as

1000
00:53:53,640 --> 00:53:55,120
I've understood.
Tell me if I get this wrong, but

1001
00:53:55,120 --> 00:53:57,640
within the language models it
works within a set of data.

1002
00:53:58,200 --> 00:54:00,960
Here's a reflection of
everybody's thoughts, so to

1003
00:54:00,960 --> 00:54:03,160
speak.
Yeah, one, one of the earlier

1004
00:54:03,400 --> 00:54:05,920
ChatGPT bots was basically just
Reddit, right?

1005
00:54:05,920 --> 00:54:09,720
And Reddit had some real curly
opinions in dark corners that

1006
00:54:09,800 --> 00:54:11,480
would come out from time to
time, right?

1007
00:54:11,480 --> 00:54:16,280
So a big part of the modern LLMs
is something called human

1008
00:54:16,280 --> 00:54:19,360
feedback reinforcement learning.
And it's where they've kind of

1009
00:54:19,360 --> 00:54:23,720
really essentially guided it to
respond in a certain way, you

1010
00:54:23,720 --> 00:54:26,120
know, the way that it'll repeat
back the question and.
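The reinforcement-learning-from-human-feedback idea Brice mentions — steering a model trained on raw internet text toward responses humans actually prefer — can be caricatured as picking, among candidate outputs, whichever a preference-trained reward model scores highest. This is a toy best-of-n sketch with a handcrafted reward function, nothing like a production RLHF pipeline:

```python
# Toy stand-in for a reward model trained on human preference labels:
# it scores a candidate response with a couple of handcrafted features.
def reward(response: str) -> float:
    score = 0.0
    if "sorry" not in response.lower():
        score += 1.0  # raters marked needless refusals/apologies down
    # mild preference for fuller answers, capped at 30 words
    score += min(len(response.split()), 30) / 30
    return score

def best_of_n(candidates):
    """Best-of-n selection: keep the candidate the reward model likes
    most. Real RLHF instead fine-tunes the model's weights against the
    reward signal, but the guiding idea is the same."""
    return max(candidates, key=reward)

candidates = [
    "Sorry, I cannot help with that.",
    "Here is a balanced summary of the question you asked.",
]
print(best_of_n(candidates))  # the helpful answer wins
```

This is also why the "reflection of consensus" point that follows holds: the reward signal comes from human raters, so the tuned model is pulled toward what most people already think.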

1011
00:54:26,120 --> 00:54:28,800
And you know, yeah.
So interestingly, like what

1012
00:54:28,800 --> 00:54:32,720
we're seeing at the moment,
in our experience, is like,

1013
00:54:33,240 --> 00:54:37,120
it's not, I don't think it's in
a place where it can think

1014
00:54:37,120 --> 00:54:41,160
counter-cyclically because it's
digesting what's out

1015
00:54:41,160 --> 00:54:44,520
there that's a reflection of the
other human thoughts.

1016
00:54:44,520 --> 00:54:47,000
So let's say everybody hates
PGMs at the moment.

1017
00:54:47,000 --> 00:54:50,880
There's any amount of analysis
that says PGMs suck, you go put

1018
00:54:50,880 --> 00:54:53,960
your AI through that and it's
just going to reflect that.

1019
00:54:54,360 --> 00:54:58,000
So in many ways, I think AI at
the minute is kind of a

1020
00:54:58,000 --> 00:55:01,920
reflection of consensus if
that's the thing you're trying

1021
00:55:01,920 --> 00:55:06,160
to do with the AI.
So it's helpful, but it just

1022
00:55:06,840 --> 00:55:08,840
boils down consensus.
It doesn't help you think

1023
00:55:08,840 --> 00:55:11,080
counter-intuitively or
counter-cyclically.

1024
00:55:11,440 --> 00:55:14,160
And I think that's where the
human interface plays a real

1025
00:55:14,160 --> 00:55:16,040
role.
And if you can then, yeah, guide

1026
00:55:16,040 --> 00:55:18,600
it in that direction, or just
do it yourself and say, well, I

1027
00:55:18,600 --> 00:55:19,880
don't give a fuck if you think
that.

1028
00:55:20,040 --> 00:55:22,480
Like just go and give me the
best name.

1029
00:55:22,600 --> 00:55:26,200
You give it the data of how
you've invested historically and

1030
00:55:26,200 --> 00:55:28,840
maybe it can understand
your style.

1031
00:55:28,840 --> 00:55:31,560
That's the real extreme,
where I've been asking

1032
00:55:31,560 --> 00:55:34,200
for nearly a year, yeah.
And we haven't been able to get

1033
00:55:34,200 --> 00:55:35,320
there yet.
I think it's a.

1034
00:55:35,800 --> 00:55:38,800
It's not as easy as it sounds
like I just keep banging the

1035
00:55:38,800 --> 00:55:41,120
table and saying, why can't I
just chuck in the

1036
00:55:41,960 --> 00:55:43,760
Yeah, you know.
The spreadsheet with all the you

1037
00:55:43,760 --> 00:55:47,000
know, it should go correlate
what were the commodity prices

1038
00:55:47,000 --> 00:55:48,880
at the Nero.
That's definitely you know

1039
00:55:48,920 --> 00:55:51,000
that's your one that's in our
work stream.

1040
00:55:51,000 --> 00:55:55,520
Yeah, in our sort of checklist
of things we want to get done,

1041
00:55:55,520 --> 00:55:56,960
that's right up there.
So this

1042
00:55:56,960 --> 00:55:58,800
is part of the
innovations that I'm seeing in

1043
00:55:58,800 --> 00:56:00,760
our current moment that are
really exciting.

1044
00:56:00,760 --> 00:56:03,840
So one of the more recent papers
is about the Model Context
Protocol, and this is essentially
1045
00:56:03,840 --> 00:56:08,240
protocol and this is essentially
a protocol for how large

1046
00:56:08,240 --> 00:56:11,040
language models in particular
should be accessing certain

1047
00:56:11,040 --> 00:56:12,920
platforms.
So you should see this

1048
00:56:12,920 --> 00:56:17,320
integration on, you know, maybe
Slack or Notion or any of your

1049
00:56:17,320 --> 00:56:20,040
other big apps.
But the grander purpose of it is

1050
00:56:20,040 --> 00:56:25,280
to go to those on-prem databases
where like a drill hole analysis

1051
00:56:25,280 --> 00:56:28,520
database would be a good example
in mining where you want the

1052
00:56:28,520 --> 00:56:31,600
mining company to have their own
LLM that's still got a lot of

1053
00:56:31,600 --> 00:56:35,040
the language and skills from the
base level, but it can access

1054
00:56:35,040 --> 00:56:37,720
your data.
And if you can imagine

1055
00:56:37,720 --> 00:56:41,280
integrating that with a data
lake where the mine plan is

1056
00:56:41,280 --> 00:56:44,160
there, where you've got all the
operating specs and even your

1057
00:56:44,160 --> 00:56:48,560
maintenance and CapEx, that is
where accessing that large

1058
00:56:48,560 --> 00:56:50,960
language model.
This is where there's like a bit

1059
00:56:50,960 --> 00:56:54,840
of a CapEx piece for the.
So there's like a heap of third

1060
00:56:54,840 --> 00:56:57,800
party stuff, right, which you
can grab and plug in and

1061
00:56:57,800 --> 00:56:59,040
optimize yourself with right
now.
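[Editor's note: the protocol discussed here sounds like the Model Context Protocol (MCP), which standardises how an LLM discovers and calls external tools such as a Slack workspace or an on-prem database. A minimal, hypothetical sketch of that pattern in plain Python — no real MCP SDK is used, and the table, tool name `query_assays`, and schema are all invented for illustration: a tool schema the model can read, a JSON tool call, and a handler that queries a stand-in drill-hole database that never leaves site.]

```python
import json
import sqlite3

def build_drillhole_db() -> sqlite3.Connection:
    """Stand-in for the miner's on-prem assay database."""
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE assays (hole_id TEXT, depth_m REAL, cu_pct REAL)")
    db.executemany(
        "INSERT INTO assays VALUES (?, ?, ?)",
        [("DH001", 10.0, 0.8), ("DH001", 20.0, 1.4), ("DH002", 10.0, 0.3)],
    )
    return db

# The "tool schema" the model sees: name, description and typed arguments,
# mirroring how an MCP-style server advertises its tools. All names invented.
TOOL_SCHEMA = {
    "name": "query_assays",
    "description": "Average copper grade for a drill hole",
    "arguments": {"hole_id": "string"},
}

def handle_tool_call(db: sqlite3.Connection, call: dict) -> dict:
    """Dispatch a JSON tool call from the model; return a JSON-able result."""
    assert call["name"] == TOOL_SCHEMA["name"]
    hole = call["arguments"]["hole_id"]
    (avg,) = db.execute(
        "SELECT AVG(cu_pct) FROM assays WHERE hole_id = ?", (hole,)
    ).fetchone()
    return {"hole_id": hole, "avg_cu_pct": round(avg, 4)}

if __name__ == "__main__":
    db = build_drillhole_db()
    # The model would emit this call after reading TOOL_SCHEMA.
    request = json.loads('{"name": "query_assays", "arguments": {"hole_id": "DH001"}}')
    print(handle_tool_call(db, request))
```

The point of the pattern is exactly what's said above: the raw data stays on the miner's own infrastructure, and only a narrow, schema-described query surface is exposed to the model.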

1062
00:56:59,080 --> 00:57:00,400
Yeah, DeepSeek in there.

1063
00:57:00,400 --> 00:57:02,440
Open sourcing is a good example
for today, yeah.

1064
00:57:02,440 --> 00:57:05,160
Yeah, and and I think what
you're talking about is really

1065
00:57:05,160 --> 00:57:10,520
interesting because there's also
your own universe where you can

1066
00:57:10,560 --> 00:57:15,200
create your own learning.
And I will fuck this up

1067
00:57:15,200 --> 00:57:16,760
with vernacular and how I
explain this.

1068
00:57:16,760 --> 00:57:21,920
But yes, your own pool and then
and then and then utilize AI to

1069
00:57:21,960 --> 00:57:24,280
analyse that in real time.
So we are looking at exactly

1070
00:57:24,280 --> 00:57:25,760
that.
But it is a CapEx that's.

1071
00:57:25,800 --> 00:57:26,800
Alpha, right?
Yes.

1072
00:57:27,000 --> 00:57:29,560
You've got to keep alpha on your
own service and don't let anyone

1073
00:57:29,560 --> 00:57:31,280
else have access to alpha.
Exactly.

1074
00:57:31,280 --> 00:57:34,320
So what what defines like a
fund's competitive advantage?

1075
00:57:34,320 --> 00:57:39,640
It's it's, it's the funnel of
information that you as a fund

1076
00:57:39,640 --> 00:57:43,320
get that with, you know, that's
deal flow and information.

1077
00:57:43,640 --> 00:57:47,160
You as a fund get better access
to that theoretically than than

1078
00:57:47,160 --> 00:57:50,320
than other people.
If you don't, then you shouldn't

1079
00:57:50,320 --> 00:57:54,040
be doing it.
It's then it's then taking that

1080
00:57:54,040 --> 00:57:57,520
funnel and distilling that to
competitive advantage.

1081
00:57:57,520 --> 00:57:59,240
So that's you've got to
translate that to a knowledge

1082
00:57:59,720 --> 00:58:03,760
and then you've got to translate
that to a competitive advantage.

1083
00:58:03,800 --> 00:58:11,120
And, and sorry, but to do that,
that's your own ecosystem.

1084
00:58:11,120 --> 00:58:13,360
And that's all your inbounds,
like you guys would say this,

1085
00:58:13,360 --> 00:58:15,520
right?
Like your daily inbounds on

1086
00:58:15,520 --> 00:58:17,840
e-mail, all the different
investment banks, all the, and

1087
00:58:17,840 --> 00:58:21,240
you know, it's real time.
That stuff doesn't turn up in

1088
00:58:21,240 --> 00:58:25,640
Bloomberg necessarily or S&P in
these data banks that you can

1089
00:58:25,640 --> 00:58:28,760
plug AI into.
Now that's not necessary that

1090
00:58:28,760 --> 00:58:31,720
that real time flow isn't
necessarily hitting those data

1091
00:58:31,720 --> 00:58:36,440
banks at at in real time.
They're sort of backward looking

1092
00:58:36,440 --> 00:58:38,400
and I would argue they probably
remain so.

1093
00:58:38,680 --> 00:58:42,280
Whereas you as a fund have this
flow of information, you're then

1094
00:58:42,280 --> 00:58:45,760
exchanging it internally,
whether that's through WhatsApp

1095
00:58:45,760 --> 00:58:48,160
or whatever other means of
communication you have e-mail,

1096
00:58:48,160 --> 00:58:50,960
etc.
But you should have AI plugged

1097
00:58:50,960 --> 00:58:53,680
into that unique ecosystem of
your own.

1098
00:58:53,680 --> 00:58:55,280
And that's, that's exactly what
we're mapping.

1099
00:58:55,320 --> 00:58:55,840
Yeah.
And.

1100
00:58:55,840 --> 00:58:59,040
And the that same concept of the
alpha for investing is, is

1101
00:58:59,040 --> 00:59:01,920
exactly what I'm seeing in
mining tech solutions as well,

1102
00:59:01,920 --> 00:59:05,640
right, is that if you have a
domain expertise and you're able

1103
00:59:05,640 --> 00:59:10,200
to serve the entire global
mining industry, you develop

1104
00:59:10,200 --> 00:59:13,480
this capability that is just
unparalleled.
Just the

1105
00:59:13,480 --> 00:59:16,120
miner, or do you own the data?
Well.

1106
00:59:16,120 --> 00:59:19,080
I think that when you talk about
the raw data, that the miner

1107
00:59:19,080 --> 00:59:20,760
always just needs to own the raw
data.

1108
00:59:20,760 --> 00:59:23,680
And that's really a big part of
what things like GDPR are doing

1109
00:59:23,680 --> 00:59:25,720
for personal data at this point
in time.

1110
00:59:25,720 --> 00:59:29,720
But when it comes to the process
data, right, the actual wisdom

1111
00:59:29,720 --> 00:59:33,320
that can be imparted on by a
domain expert, which at this

1112
00:59:33,320 --> 00:59:36,480
point is still a combination of
a human and AI.

1113
00:59:36,800 --> 00:59:40,800
But in the future, right, you
could have completely autonomous

1114
00:59:41,000 --> 00:59:45,480
organizations offering services
to mines, which is basically

1115
00:59:45,480 --> 00:59:48,240
just input your data here and
your knowledge comes out this

1116
00:59:48,240 --> 00:59:52,280
side and, and that could be
almost completely autonomous,

1117
00:59:52,280 --> 00:59:55,080
right?
It's a big part of where I, I

1118
00:59:55,080 --> 00:59:58,000
see the mining tech landscape
going at the moment.

1119
00:59:58,360 --> 01:00:01,160
And especially when you look at,
you know, what could be

1120
01:00:01,160 --> 01:00:04,880
happening in surface mines,
there's a lot of autonomous tech

1121
01:00:04,880 --> 01:00:08,240
now that's going down the
production drill holes and

1122
01:00:08,240 --> 01:00:11,040
scanning them, right, the
production drill holes.

1123
01:00:11,720 --> 01:00:14,480
Way cheaper because they're
they're just, you know, big air

1124
01:00:14,480 --> 01:00:17,600
drills, they're not diamond core
and they're in a really tight

1125
01:00:17,600 --> 01:00:19,960
spacing.
So from a geologist point of

1126
01:00:19,960 --> 01:00:22,360
view, it's an amazing
opportunity to get a much

1127
01:00:22,360 --> 01:00:27,000
tighter view of the resource and
make sure that you are actually

1128
01:00:27,000 --> 01:00:30,880
sending the right material to
the mill and getting this higher

1129
01:00:30,880 --> 01:00:34,400
metal call factor recovery of
your reserves.
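[Editor's note: the reconciliation metric being described here is the mine call factor — the ratio of metal actually accounted for at the mill to the metal the resource model predicted. Tighter grade control from dense production-hole scanning pushes it toward 100%. A toy calculation, with invented numbers:]

```python
# Illustrative mine-to-mill reconciliation; all figures are made up.
# Call factor = (metal actually milled) / (metal the model predicted) * 100.

def call_factor(predicted_tonnes: float, predicted_grade_pct: float,
                milled_tonnes: float, milled_grade_pct: float) -> float:
    predicted_metal = predicted_tonnes * predicted_grade_pct / 100
    actual_metal = milled_tonnes * milled_grade_pct / 100
    return 100 * actual_metal / predicted_metal

# Model said 1 Mt at 1.20% Cu; the mill actually saw 0.98 Mt at 1.10% Cu.
cf = call_factor(1_000_000, 1.20, 980_000, 1.10)
print(f"call factor: {cf:.1f}%")  # call factor: 89.8%
```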

1130
01:00:34,840 --> 01:00:39,120
But at the moment, you know,
where a lot of this tech is

1131
01:00:39,120 --> 01:00:41,720
going is that it's got these
regulatory hurdles.

1132
01:00:42,000 --> 01:00:45,760
And so that's really exactly
where we've got to come back to

1133
01:00:45,760 --> 01:00:49,400
this core point of how do we
ensure that the the board is

1134
01:00:49,400 --> 01:00:51,560
going to look at these
opportunities and go, we're

1135
01:00:51,560 --> 01:00:53,720
going to push for this, we're
going to do this innovation.

1136
01:00:53,720 --> 01:00:54,760
Yeah, it's it's.
Interesting.

1137
01:00:54,760 --> 01:00:58,280
I reckon like the darker here is
going to be whether whether you

1138
01:00:58,280 --> 01:01:01,080
know, you're an investor like us
or, or, or a corporate and an

1139
01:01:01,080 --> 01:01:03,640
operator, you know, a mining
company, it's going to be like,

1140
01:01:03,640 --> 01:01:09,960
how do you, how do you utilize
AI in a bespoke way to your own

1141
01:01:09,960 --> 01:01:12,280
data lake?
That's your, that's the only

1142
01:01:12,280 --> 01:01:16,240
moat that you own here.
That as far as I can see, it's

1143
01:01:16,240 --> 01:01:19,520
not going to be like, how do I
apply, you know, some really

1144
01:01:19,520 --> 01:01:24,320
smart thinking to a universally
available pool of data that that

1145
01:01:24,320 --> 01:01:27,280
anybody, everybody's going to be
racing at the same time for

1146
01:01:27,280 --> 01:01:29,160
that.
So it's like, how do you, how do

1147
01:01:29,200 --> 01:01:32,520
you define your own data lake?
And then how do you, you know,

1148
01:01:32,520 --> 01:01:34,360
build the AI infrastructure into
that?

1149
01:01:34,360 --> 01:01:37,600
And it's, it's a hard one.
Like we're working through all

1150
01:01:37,600 --> 01:01:40,560
of that because there's a
component of capital required to

1151
01:01:40,560 --> 01:01:41,680
do that.
These things aren't cheap.

1152
01:01:42,200 --> 01:01:45,120
And then you don't want to
invest that capital in six

1153
01:01:45,120 --> 01:01:47,840
months, 12 months later have it,
you know, redundant because

1154
01:01:47,840 --> 01:01:49,920
there's something off the shelf
or, or whatever.

1155
01:01:49,920 --> 01:01:54,000
So yeah, yeah, I I don't have
all the answers, but we're

1156
01:01:54,080 --> 01:01:56,320
working on it.
Like to the to the point of of

1157
01:01:56,880 --> 01:01:58,920
the technology being an
equalizer in some respect, like

1158
01:01:59,680 --> 01:02:01,880
do our markets not just get
tremendously more efficient

1159
01:02:01,880 --> 01:02:04,880
because kind of everyone's got I
know, I know there is still a

1160
01:02:04,880 --> 01:02:08,280
moat in the fund with its own
information and synthesizing all

1161
01:02:08,280 --> 01:02:09,880
of that, but you would.
Think so.

1162
01:02:09,880 --> 01:02:13,080
I reckon the really interesting
part for commodities is that

1163
01:02:13,320 --> 01:02:17,160
it's such an unsexy part of the
equity world, you know, as an

1164
01:02:17,160 --> 01:02:19,200
asset class, no one want, no one
gives a fuck.

1165
01:02:19,520 --> 01:02:26,560
Yeah, commodities, we've got 70%
of every equity dollar globally

1166
01:02:26,560 --> 01:02:29,480
sits in the US and some huge
fraction of that in in the Mag

1167
01:02:29,480 --> 01:02:31,760
7.
So like, the concentration of

1168
01:02:31,760 --> 01:02:34,080
capital is massive.
No one's really looking at the

1169
01:02:34,080 --> 01:02:37,720
sector.
It's an enormously specialized

1170
01:02:37,720 --> 01:02:40,920
space.
Like the knowledge required to

1171
01:02:40,920 --> 01:02:45,720
really land somewhere informed,
I think is enormously

1172
01:02:45,720 --> 01:02:46,880
specialized.
And.

1173
01:02:47,360 --> 01:02:50,040
And so I reckon there is this
window here to get this right.

1174
01:02:50,040 --> 01:02:51,720
That's what excites me.
I reckon coming back from

1175
01:02:51,720 --> 01:02:54,800
Christmas and being like, Oh
yeah, no, you know, I'm pretty

1176
01:02:54,800 --> 01:02:57,760
pumped about getting this right
in our, our space because it

1177
01:02:57,760 --> 01:03:00,800
isn't sexy.
And resources, commodities,

1178
01:03:00,800 --> 01:03:04,280
mining, it's never been sexy.
So I feel like we're going to

1179
01:03:04,280 --> 01:03:09,120
get a bit of a longer window
maybe to apply all the stuff

1180
01:03:09,120 --> 01:03:12,520
that the tech side of things is
bringing because the tech side

1181
01:03:12,760 --> 01:03:14,800
naturally applies it to its own
environment.

1182
01:03:14,800 --> 01:03:17,760
So that where have we seen a,
where have we seen AI applied

1183
01:03:17,760 --> 01:03:20,120
first?
It's like, oh, it's in coding,

1184
01:03:20,360 --> 01:03:25,640
it's in graphics, it's these are
all like tech places because

1185
01:03:25,640 --> 01:03:28,240
it's their, it's just a natural
thing for them.

1186
01:03:28,240 --> 01:03:31,880
It's their ecosystem.
I feel like we get an

1187
01:03:31,880 --> 01:03:36,400
opportunity, particularly in
mining finance to get really

1188
01:03:36,400 --> 01:03:39,280
lateral and I think we'll I
don't think mining is going to

1189
01:03:39,280 --> 01:03:42,840
become sexy tomorrow.
So, so that that excites me.

1190
01:03:42,840 --> 01:03:46,080
I feel like, you know that, that
that's cool and like you guys

1191
01:03:46,080 --> 01:03:49,840
would know this so well, right?
Like information asymmetry down.

1192
01:03:49,840 --> 01:03:54,160
There's nowhere I don't, I know
of where it exists more than in

1193
01:03:54,160 --> 01:03:56,200
than in mining.
And maybe that's a function of

1194
01:03:56,320 --> 01:03:58,160
how many small caps we have.
I'm not sure.

1195
01:03:58,160 --> 01:04:02,280
But without question, it, it, it
exists as you go down the market

1196
01:04:02,280 --> 01:04:04,600
cap curve.
And I think that talks like

1197
01:04:04,600 --> 01:04:07,040
incentives and, and why it
exists.

1198
01:04:07,040 --> 01:04:10,040
I think that will stay that way.
And if you can apply, if you can

1199
01:04:10,040 --> 01:04:14,440
get even better at thinking
outside the square, then then

1200
01:04:14,600 --> 01:04:17,000
then great.
The next step is probably like,

1201
01:04:17,000 --> 01:04:19,880
how do we then how do we then
interface that?

1202
01:04:19,880 --> 01:04:21,960
And that's certainly something,
you know, I know we need to do a

1203
01:04:21,960 --> 01:04:23,320
lot better.
It's like, how do we then

1204
01:04:23,320 --> 01:04:26,560
interface that back to the
market to say, Hey, hey, you've,

1205
01:04:26,680 --> 01:04:29,200
you've missed something.
This is, this is you should

1206
01:04:29,200 --> 01:04:32,600
allocate capital here.
That bit, I think in small cap

1207
01:04:32,600 --> 01:04:36,000
mining we could do way better.
I, I would argue that we see

1208
01:04:36,000 --> 01:04:38,800
companies and assets that
shouldn't get funded, funded all

1209
01:04:38,800 --> 01:04:41,040
the time.
And we see companies and assets

1210
01:04:41,040 --> 01:04:43,040
that should get funded, not get
funded.

1211
01:04:43,680 --> 01:04:47,280
And and so I would say like the
efficient allocation of capital

1212
01:04:47,280 --> 01:04:51,520
in, in metals and mining it,
there's a huge room for

1213
01:04:51,520 --> 01:04:56,800
improvement in in doing that.
Is is mining not being sexy just

1214
01:04:56,800 --> 01:04:58,640
from that capital allocation
perspective?

1215
01:04:58,800 --> 01:05:00,240
No one.
Wants to go to a fucking dinner

1216
01:05:00,240 --> 01:05:01,800
party and say I invest in
mining.

1217
01:05:01,880 --> 01:05:03,720
No, no one.
It's like saying I pick up the

1218
01:05:03,720 --> 01:05:04,800
garbage.
Try

1219
01:05:04,800 --> 01:05:09,120
saying you're a podcaster.
It's not working for you man.

1220
01:05:10,240 --> 01:05:13,440
Yeah, I just, I, I find I really
have a love for mining, right?

1221
01:05:13,440 --> 01:05:16,040
And maybe, maybe I'm one of
those rare people who think it

1222
01:05:16,040 --> 01:05:18,240
is sexy.
But when I look at everything in

1223
01:05:18,240 --> 01:05:21,200
this world, especially the, the
more tech we've got, right,

1224
01:05:21,200 --> 01:05:25,080
there's so much copper we need
for this clean energy transition

1225
01:05:25,080 --> 01:05:27,520
and and nuclear is a massive
part of that, right.

1226
01:05:27,520 --> 01:05:30,640
So we've got to get our hands on
all the nuclear resources too, and

1227
01:05:30,880 --> 01:05:34,320
work out that supply chain to
turn these raw minerals into the

1228
01:05:34,320 --> 01:05:37,120
products we need to advance and
grow.

1229
01:05:37,480 --> 01:05:39,360
Yeah, right.
It's, it's an immensely,

1230
01:05:40,440 --> 01:05:43,240
immensely cool journey that I
think I'm really proud to play a

1231
01:05:43,240 --> 01:05:46,520
role in.
So particularly when you see it

1232
01:05:46,520 --> 01:05:50,360
from the perspective that all of
these resources are one in a

1233
01:05:50,520 --> 01:05:53,600
billion occurrences, freak
events that happened millions of

1234
01:05:53,600 --> 01:05:56,720
years ago in our Earth's
formation, you see them as

1235
01:05:56,720 --> 01:05:58,840
unique.
And I think that's why I love

1236
01:05:58,840 --> 01:06:02,440
working with geologists, right?
They have this real respect for

1237
01:06:02,440 --> 01:06:05,720
the resource.
They love not only to discover

1238
01:06:05,720 --> 01:06:10,280
and define it, but protect it
right when when we're mining I I

1239
01:06:10,280 --> 01:06:13,360
know that some people can see a
giant hole in the ground as a

1240
01:06:13,360 --> 01:06:17,720
scar on the earth, but I've seen
some massive pits in my time and

1241
01:06:17,720 --> 01:06:20,040
they are feats of engineering I
reckon.

1242
01:06:20,080 --> 01:06:21,440
I come across cheerleaders all
the time.

1243
01:06:21,440 --> 01:06:23,720
They don't want to see it mined,
they they love the ore body

1244
01:06:23,720 --> 01:06:25,440
itself.
It's a bit like the the golf

1245
01:06:25,440 --> 01:06:27,440
course curator.
He doesn't want to see anybody

1246
01:06:27,440 --> 01:06:29,720
out there playing fucking golf,
cutting up his course.

1247
01:06:30,680 --> 01:06:34,440
I I share the love and and like,
I like being in a unsexy

1248
01:06:34,440 --> 01:06:35,800
industry.
I reckon it's rad.

1249
01:06:35,800 --> 01:06:40,400
I love going against the grain.
I think it it takes way more.

1250
01:06:41,120 --> 01:06:43,200
I guess I just got mad respect
for anybody who does it.

1251
01:06:43,200 --> 01:06:45,960
Well, yeah.
And so, yeah, I agree with you.

1252
01:06:45,960 --> 01:06:48,600
I think the sector is so
important.

1253
01:06:48,800 --> 01:06:53,160
This this you know, this fallacy
that we're, you know, we see

1254
01:06:53,440 --> 01:06:56,240
permeated time and time again
from groups like the EU that we

1255
01:06:56,240 --> 01:07:00,000
can live in some, you know,
world of of, you know, fully,

1256
01:07:00,000 --> 01:07:03,680
fully self-sufficient recycling
is just fucking fantasy.

1257
01:07:03,760 --> 01:07:07,040
It is utter fantasy.
And, and yeah, we've barely

1258
01:07:07,040 --> 01:07:09,120
scratched the surface of the
earth really with mining.

1259
01:07:09,840 --> 01:07:14,440
There's no true limitation to,
to what we can access.

1260
01:07:14,440 --> 01:07:18,160
It's just a cost function.
You know, there is no peak oil,

1261
01:07:18,680 --> 01:07:21,640
There isn't.
There are enormously abundant

1262
01:07:21,800 --> 01:07:25,160
quantums of, of resource
available within the planet and

1263
01:07:25,920 --> 01:07:28,040
it's purely a cost function to,
to access them.

1264
01:07:28,040 --> 01:07:31,760
And without question, I think
we're going to need a lot more

1265
01:07:31,760 --> 01:07:34,520
energy and a lot more
commodities because we've got a

1266
01:07:34,520 --> 01:07:37,800
lot of people still to pull out
of poverty, a lot of people just

1267
01:07:37,800 --> 01:07:40,640
to, to get to a reasonable
living standard.

1268
01:07:40,880 --> 01:07:44,400
And then we've got the first
world going to a place of energy

1269
01:07:44,400 --> 01:07:46,960
intensity that it never had
before.

1270
01:07:46,960 --> 01:07:50,000
We're taking, we're about to
jump a big quantum leap in

1271
01:07:50,000 --> 01:07:54,160
energy usage per person because
of data and because of AI and,

1272
01:07:54,160 --> 01:07:57,280
and I unashamedly think that's a
good thing.

1273
01:07:57,480 --> 01:08:01,360
There's nothing bad about that.
I, I know we'll get lots of

1274
01:08:01,960 --> 01:08:06,080
climate change arguments.
Climate change is probably the

1275
01:08:06,080 --> 01:08:10,920
worst-used vernacular I've ever
seen. Climates change.

1276
01:08:11,120 --> 01:08:15,760
That is a fundamental tenet of
existence on this earth: change.

1277
01:08:16,160 --> 01:08:21,560
And so it's, it's one of the
worst misuses of, of verbiage

1278
01:08:21,560 --> 01:08:24,800
I've seen.
And, and, and, and, and you

1279
01:08:24,800 --> 01:08:28,520
know, the, the things that we
can do is just get, get, get

1280
01:08:28,520 --> 01:08:31,640
better at being industrial.
If you're a true

1281
01:08:31,640 --> 01:08:36,880
environmentalist, mining is a
key function in, in a lower

1282
01:08:36,880 --> 01:08:39,800
impact on the planet.
And I know that sounds very

1283
01:08:39,800 --> 01:08:43,600
counter intuitive, but a true
environmentalist in my mind

1284
01:08:44,240 --> 01:08:50,120
wants the most energy efficient
system, agricultural efficiency.

1285
01:08:50,120 --> 01:08:55,000
So get the fuck off the land,
intensively farm, get the most

1286
01:08:55,200 --> 01:08:57,960
out of, out of your, your, your
interface with the land as you

1287
01:08:57,960 --> 01:09:03,520
can, live in, in cities and in
urban environments, and then,

1288
01:09:03,560 --> 01:09:05,720
and then leave as much nature
untouched as you can.

1289
01:09:05,720 --> 01:09:07,439
That, that, that's a real
environmentalist.

1290
01:09:07,439 --> 01:09:12,200
And that involves nuclear and,
and, you know, energy efficiency

1291
01:09:12,200 --> 01:09:14,680
and, and, and all of these sorts
of modern things that we're all

1292
01:09:14,680 --> 01:09:16,880
doing.
And yet bizarrely, the left

1293
01:09:16,880 --> 01:09:20,520
continues to seem to fight us on
and sell this idea that you

1294
01:09:21,200 --> 01:09:24,399
know, you know, this fatalism
and we're all fucked and humans

1295
01:09:24,399 --> 01:09:27,680
are it's just garbage.
The best thing we can do is keep

1296
01:09:27,680 --> 01:09:31,800
being progressive, keep keep
digging holes, keep getting

1297
01:09:31,800 --> 01:09:34,439
better at it, keep educating
people.

1298
01:09:34,439 --> 01:09:37,359
If you want to get human impact
lower.

1299
01:09:38,240 --> 01:09:40,200
Well, less people is probably a
start.

1300
01:09:40,200 --> 01:09:45,200
And the key to getting less
people is actually rising GDP

1301
01:09:45,200 --> 01:09:48,279
and education.
So sorry, I know that's way off

1302
01:09:48,279 --> 01:09:50,560
topic, but let's.
Get even more philosophical

1303
01:09:51,200 --> 01:09:58,400
Rusty, in a in an AI augmented
society, where does where does

1304
01:09:58,400 --> 01:10:03,840
capital flow?
Oh, this one's I think we

1305
01:10:03,840 --> 01:10:05,120
touched on this a little bit
earlier.

1306
01:10:05,400 --> 01:10:11,240
Like so I saw Elon made some
comments like, do we even need

1307
01:10:11,240 --> 01:10:16,120
money?
Like I was like, whoa, OK, I

1308
01:10:16,120 --> 01:10:21,000
think I think that might be, we
probably don't have to jump

1309
01:10:21,000 --> 01:10:23,440
there.
Let's let's say on the path to

1310
01:10:23,440 --> 01:10:26,640
90% of everything getting
fundamentally cheaper and you've

1311
01:10:26,640 --> 01:10:28,480
got this pool of capital.
It's like, well, where do you

1312
01:10:28,480 --> 01:10:29,920
allocate it?
What do you do with it?

1313
01:10:31,640 --> 01:10:35,000
I would say in terms of asset
classes, you're going to chase

1314
01:10:36,800 --> 01:10:39,760
assets that you can't create,
goods and services that you

1315
01:10:39,760 --> 01:10:45,800
can't just create.
So collectibles, IE pieces of

1316
01:10:45,800 --> 01:10:49,240
art, they're going to, you know,
you're going to chase those.

1317
01:10:49,760 --> 01:10:54,440
They're one offs by definition.
Hard assets, I would think

1318
01:10:54,560 --> 01:10:58,000
property like you're going to,
you're going to scramble for

1319
01:10:58,000 --> 01:11:01,400
property globally.
You can't, you're not making any

1320
01:11:01,400 --> 01:11:06,280
more earth.
And, and, and I think that that

1321
01:11:06,280 --> 01:11:09,360
equally applies to, to
commodities and, and ore bodies,

1322
01:11:09,440 --> 01:11:13,520
things that are in the ground.
So I I and then beyond that, I,

1323
01:11:13,520 --> 01:11:19,680
I, I, I don't know, extending
your existence for a few

1324
01:11:19,680 --> 01:11:23,600
thousand years and, and these
lots of concepts, you know, it's

1325
01:11:23,600 --> 01:11:25,520
going to be a fucking mind
blowing world.

1326
01:11:25,520 --> 01:11:27,760
I mean, like you can, you can
start having conversations

1327
01:11:27,760 --> 01:11:31,000
around this concept of human
ascension and like, you know, do

1328
01:11:31,000 --> 01:11:32,560
you move from the physical to
the.

1329
01:11:34,000 --> 01:11:38,120
Yeah, 1 interesting AI concept
is to replicate your

1330
01:11:38,120 --> 01:11:40,320
consciousness on a computer,
right?

1331
01:11:40,320 --> 01:11:43,400
And you can have this debate
about whether it is conscious or

1332
01:11:43,400 --> 01:11:45,680
not.
But what's interesting is that

1333
01:11:45,720 --> 01:11:48,280
it's a program running on a
computer that's powered by

1334
01:11:48,280 --> 01:11:50,040
electricity and that costs
money.

1335
01:11:50,040 --> 01:11:54,240
So even if you can emulate your
consciousness, you still have to

1336
01:11:54,240 --> 01:11:56,840
have a job, you still got to pay
outgoings, right?

1337
01:11:56,840 --> 01:12:00,960
And your kind of ascension in all
these AI scenarios is always still

1338
01:12:00,960 --> 01:12:04,040
constrained.
And so I like that idea of

1339
01:12:04,040 --> 01:12:06,720
exploration, right?
And hopefully where the capital

1340
01:12:06,720 --> 01:12:09,640
flows, when real estate starts to
get really high and if we

1341
01:12:09,640 --> 01:12:12,600
struggle with our commodity
problems is to colonize on other

1342
01:12:12,600 --> 01:12:15,240
planets, right?
And another potential part of

1343
01:12:15,240 --> 01:12:17,560
that commodity mix is asteroid
mining.

1344
01:12:17,840 --> 01:12:20,640
You know, there's so much metal
out there in the rest of the

1345
01:12:20,640 --> 01:12:24,280
solar system and it doesn't have
those environmental constraints.

1346
01:12:24,280 --> 01:12:26,120
As far as we know, there's no
life out there.

1347
01:12:26,560 --> 01:12:29,560
And it's really interesting to
think, you know, how a supply

1348
01:12:29,560 --> 01:12:32,800
and demand curve could change if
you all of a sudden discover 10

1349
01:12:32,800 --> 01:12:35,520
times the amount of copper
that's ever existed on the Earth

1350
01:12:35,520 --> 01:12:37,760
in a low hanging asteroid,
right?
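[Editor's note: the supply-shock thought experiment above can be made concrete with a toy linear supply-and-demand model — every coefficient here is invented purely for illustration. Multiplying supply by ten collapses the clearing price.]

```python
# Toy linear market model (all numbers invented): demand Q = a - b*P,
# supply Q = s * (c + d*P), where s scales the whole supply curve --
# e.g. a copper-rich asteroid suddenly multiplying available supply.

def clearing_price(a: float, b: float, c: float, d: float, s: float) -> float:
    # Solve a - b*P = s*(c + d*P)  =>  P = (a - s*c) / (b + s*d)
    return (a - s * c) / (b + s * d)

base = clearing_price(a=100, b=2, c=5, d=4, s=1)    # normal supply
shock = clearing_price(a=100, b=2, c=5, d=4, s=10)  # 10x supply shift
print(round(base, 2), round(shock, 2))  # 15.83 1.19
```

The exact numbers are meaningless; the shape of the result is the point — a tenfold outward shift in supply drops the equilibrium price by an order of magnitude in this sketch.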

1351
01:12:37,760 --> 01:12:41,800
And so you add that in with also
trying to expand throughout the

1352
01:12:41,800 --> 01:12:43,360
solar system.
You know, there's some pretty

1353
01:12:43,360 --> 01:12:46,280
cool moons near Saturn as well
that we could colonize.

1354
01:12:46,280 --> 01:12:51,360
And it really is important to
de-risk our only one planet that

1355
01:12:51,360 --> 01:12:54,520
we're currently not looking
after the best by by expanding

1356
01:12:54,520 --> 01:12:56,440
and colonizing the rest of the
Galaxy. Was it

1357
01:12:56,440 --> 01:12:58,240
Bezos that was talking about
that, was it you guys?

1358
01:12:58,480 --> 01:13:00,880
Yeah, 100%.
Bezos was saying get rid of all

1359
01:13:01,440 --> 01:13:04,560
the emissions intensive
industries, put them out there.

1360
01:13:04,680 --> 01:13:07,680
Yeah, because everyone that's
ever gone out there looks back

1361
01:13:07,680 --> 01:13:10,720
at Earth and says, hey, it
doesn't get better than this.

1362
01:13:10,720 --> 01:13:12,280
This is the this is the one.
So the planet

1363
01:13:12,280 --> 01:13:16,960
becomes a National Park and
then, yeah, it's interesting.

1364
01:13:17,040 --> 01:13:20,280
I, I, I, I don't mind like the
starting premise that the planet

1365
01:13:20,280 --> 01:13:22,080
becomes a, a kind of a National
Park.

1366
01:13:22,080 --> 01:13:23,800
I think, I think there's some
merit to that.

1367
01:13:23,800 --> 01:13:26,280
But I think if we step back
through the conversation we just

1368
01:13:26,280 --> 01:13:30,080
had around like underground
mining and, and how low impact

1369
01:13:30,080 --> 01:13:33,840
you can make it, you know, how's
that going to compete from a

1370
01:13:33,840 --> 01:13:37,040
cost function point of view with
going all the way to a bloody

1371
01:13:37,040 --> 01:13:41,520
asteroid or another planet and
pulling, you know, vast

1372
01:13:41,520 --> 01:13:44,920
sums of rock back to, you know,
even refined metal back back to

1373
01:13:44,920 --> 01:13:48,160
planet Earth.
I to me that intuitively feels

1374
01:13:48,160 --> 01:13:49,520
like that could be a long way
away.

1375
01:13:49,520 --> 01:13:52,440
And and I feel like we can, we
can interface with the planet we

1376
01:13:52,440 --> 01:13:57,080
have far better and and with
with much less impact, but.

1377
01:13:57,080 --> 01:13:59,880
If if we're trying to live in
low Earth orbit, actually

1378
01:13:59,880 --> 01:14:02,560
lifting the metals off the
ground to put them into low

1379
01:14:02,560 --> 01:14:04,560
Earth orbit is incredibly
expensive.

1380
01:14:04,560 --> 01:14:07,720
So it makes more sense to pull
them out into orbit.

1381
01:14:07,840 --> 01:14:12,680
So that whole low Earth orbit
kind of habitable zone, it's got

1382
01:14:12,680 --> 01:14:14,440
to come from asteroid mining,
right?

1383
01:14:14,720 --> 01:14:17,480
And like again, you know, you'd
essentially have those two

1384
01:14:17,480 --> 01:14:19,880
commodity prices of where is it
located?

1385
01:14:19,880 --> 01:14:22,760
Is it on Earth, in the
gravity well, or is it in low

1386
01:14:22,760 --> 01:14:25,280
Earth orbit?
Any any any colonization

1387
01:14:25,280 --> 01:14:28,480
elsewhere like any any other
place other than planet Earth

1388
01:14:28,480 --> 01:14:32,440
and the the the raw materials
required for that process will

1389
01:14:32,720 --> 01:14:37,040
will not come from planet Earth.
I wouldn't have thought, yeah, I

1390
01:14:37,040 --> 01:14:38,800
hope we're all around to see all
this stuff.

1391
01:14:38,800 --> 01:14:41,600
Like I think this goes like
where do you spend your

1392
01:14:41,600 --> 01:14:45,440
capital? Well, other than sort of
asset allocation and and trying

1393
01:14:45,440 --> 01:14:48,880
to get to what I just
explained, people might as well say, I

1394
01:14:48,920 --> 01:14:51,320
don't give a fuck.
Like I don't need any return on

1395
01:14:51,320 --> 01:14:53,240
capital.
I'm pretty self-sufficient or,

1396
01:14:53,640 --> 01:14:56,960
you know, it doesn't cost me
much to to to live.

1397
01:14:57,640 --> 01:15:01,000
Maybe you're prioritizing your
wellness and and your longevity

1398
01:15:01,000 --> 01:15:06,040
and and we're seeing huge floods
of capital into into living for,

1399
01:15:06,280 --> 01:15:09,720
you know, massively extended
periods of time in various ways,

1400
01:15:09,720 --> 01:15:12,080
shapes or forms.
I, I, I could definitely see

1401
01:15:12,080 --> 01:15:15,640
that as a place that people
start really getting after and

1402
01:15:15,640 --> 01:15:19,480
particularly when you apply AI
into that like the medical

1403
01:15:19,480 --> 01:15:21,480
space.
I, I would love to, I'm sure

1404
01:15:21,760 --> 01:15:24,680
there'll be many podcasts out
there on, on that I'm sure, but

1405
01:15:25,440 --> 01:15:28,040
without question, right, we're
going to advance those fields

1406
01:15:28,600 --> 01:15:30,400
pretty quickly I I think from
here.

1407
01:15:30,680 --> 01:15:33,200
So living
forever sounds a bit dystopian

1408
01:15:33,440 --> 01:15:34,560
to to me, Rusty.
But well.

1409
01:15:34,840 --> 01:15:38,440
You get stuck in concepts like
what is this and, and you know,

1410
01:15:38,600 --> 01:15:41,680
what's on the backside of this
and should you really live here

1411
01:15:41,680 --> 01:15:44,480
forever?
And yeah, all sorts of weird and

1412
01:15:44,480 --> 01:15:47,800
wonderful stuff.
So, yeah, there's, there's so

1413
01:15:47,800 --> 01:15:51,360
many fascinating big picture
questions.

1414
01:15:51,360 --> 01:15:53,040
And I think we're going to get
so many more answers to the

1415
01:15:53,040 --> 01:15:54,640
universe.
Like it's such a it's such an

1416
01:15:54,640 --> 01:15:57,720
awesome time to be alive.
Like, you know, we've gone

1417
01:15:57,720 --> 01:15:59,960
through this period where I
think science has like

1418
01:16:00,560 --> 01:16:04,840
demystified the mystic part of
humanity, if you like, so that,

1419
01:16:04,840 --> 01:16:06,880
you know, that it used to be
like a lot of faith, like a lot

1420
01:16:06,880 --> 01:16:10,280
of religion and a lot of these
sorts of things dominated human

1421
01:16:10,280 --> 01:16:12,960
thinking.
I think science has has smoothed

1422
01:16:12,960 --> 01:16:14,680
a lot of that out, if you like,
over time.

1423
01:16:14,680 --> 01:16:17,640
I think the fascinating part of
science at the moment is like in

1424
01:16:17,640 --> 01:16:20,800
the quantum realm, it's sort of
re-mystifying things.

1425
01:16:21,080 --> 01:16:23,040
So we've gone from this place of
like, oh, we, we sort of

1426
01:16:23,040 --> 01:16:24,840
understand it all.
We get how it works.

1427
01:16:24,840 --> 01:16:27,280
You guys are fucking dreaming.

1428
01:16:27,280 --> 01:16:29,640
You know what you're thinking
with your 3000 religions.

1429
01:16:29,720 --> 01:16:33,280
Suddenly we're like, oh, quantum
is telling us that a particle

1430
01:16:33,280 --> 01:16:36,240
can communicate with another
particle in some other part of

1431
01:16:36,240 --> 01:16:39,000
the universe or, you know,
without any sort of

1432
01:16:39,560 --> 01:16:42,440
physical tangible cause and
effect.

1433
01:16:42,480 --> 01:16:45,400
Like, what the fuck?
So it's a

1434
01:16:45,400 --> 01:16:48,080
simulation layer, yeah.
The simulation, the simulation

1435
01:16:48,080 --> 01:16:50,240
layer, yeah.
There's there's such cool stuff

1436
01:16:50,240 --> 01:16:52,840
coming out of science now,
ironically, that's almost

1437
01:16:52,840 --> 01:16:54,920
re-mystifying our existence in our
universe.

1438
01:16:54,920 --> 01:16:58,400
And, you know, I find it fascinating.
I'd sort of always grown up

1439
01:16:58,400 --> 01:17:02,320
quite sort of atheist in a way.
And I think as time's gone

1440
01:17:02,320 --> 01:17:08,440
on, I look at like every human
civilization that we're aware of

1441
01:17:08,960 --> 01:17:12,160
has found some form of deity.
You know, why is that?

1442
01:17:12,160 --> 01:17:16,160
That's a remarkable coincidence.
Now, clearly they can't all be

1443
01:17:16,160 --> 01:17:19,880
right, but being an atheist suggests
they're all wrong.

1444
01:17:20,400 --> 01:17:24,240
Is it just an interpretation of
the same thing in different

1445
01:17:24,240 --> 01:17:25,760
ways?
And then, you know, religions, as

1446
01:17:25,760 --> 01:17:28,160
they do, anything that controls
people gets manipulated,

1447
01:17:28,160 --> 01:17:32,320
etcetera, etcetera, and they
form these different

1448
01:17:32,320 --> 01:17:34,880
things.
But, but the sheer fact that

1449
01:17:35,320 --> 01:17:40,600
every civilization finds a
deity, and why is that?

1450
01:17:40,720 --> 01:17:42,440
That's a fascinating
concept.

1451
01:17:42,440 --> 01:17:46,800
And so, you know, I wonder
if science is actually leading

1452
01:17:46,800 --> 01:17:49,480
us to a place of
reacquainting ourselves with

1453
01:17:49,480 --> 01:17:51,800
this mystic side of
the universe.

1454
01:17:51,800 --> 01:17:56,320
If you like, and through
quantum and, you know,

1455
01:17:56,320 --> 01:17:59,240
sacred geometry and these sorts
of places that are fucking

1456
01:17:59,240 --> 01:18:00,880
awesome.
There's there's so much cool

1457
01:18:00,880 --> 01:18:02,600
shit.
Do you ever worry?

1458
01:18:02,600 --> 01:18:04,040
I've thought a bit about this
lately.

1459
01:18:04,040 --> 01:18:07,600
You go back sort of 3 or 400
years and you think about, say,

1460
01:18:07,920 --> 01:18:11,520
Europe, the Catholic Church,
people like Galileo getting

1461
01:18:11,520 --> 01:18:14,360
completely chastised because
he didn't agree and was

1462
01:18:14,360 --> 01:18:18,880
coming from a scientific sort of
basis and that wasn't what the

1463
01:18:18,880 --> 01:18:23,640
herd thought was right.
And I reflect on like where we

1464
01:18:23,640 --> 01:18:26,800
are today, and this is really
paraphrasing people

1465
01:18:26,800 --> 01:18:29,800
far smarter than I am, but
think about the

1466
01:18:29,800 --> 01:18:33,720
scientific community and how
much people can be chastised for

1467
01:18:33,840 --> 01:18:36,360
climate change, like you said
earlier, and thinking differently.

1468
01:18:36,360 --> 01:18:37,920
Is that something you think
about much?

1469
01:18:37,920 --> 01:18:40,440
It's a huge
issue. It's like anyone who says

1470
01:18:40,440 --> 01:18:44,440
the science is settled is
immediately unscientific. Science,

1471
01:18:44,440 --> 01:18:49,640
by definition is the constant
re-questioning of

1472
01:18:50,520 --> 01:18:54,400
assumptions, and anything
that has a forward prognosis,

1473
01:18:54,400 --> 01:18:57,200
which climate change does, you
can't say it's settled, it's never

1474
01:18:57,200 --> 01:18:59,640
settled.
It's all

1475
01:18:59,640 --> 01:19:02,720
correlation, causation.
These are all models and

1476
01:19:02,720 --> 01:19:04,120
guesswork.
And, you know, I grew up, I

1477
01:19:04,400 --> 01:19:08,400
mean, you guys would have
experienced acid rain and CFCs

1478
01:19:08,400 --> 01:19:12,080
and, you know, we moved
from one fatalist concept to the

1479
01:19:12,080 --> 01:19:14,080
next.
And it happens that this time

1480
01:19:14,080 --> 01:19:15,240
they've called it climate
change.

1481
01:19:15,240 --> 01:19:18,400
I think that, yeah,
there probably is correlation

1482
01:19:18,400 --> 01:19:20,400
around global warming.
Is that going to eat the planet,

1483
01:19:20,400 --> 01:19:20,960
love?
No.

1484
01:19:20,960 --> 01:19:24,240
The planet's had heaps of carbon
throughout its history, much

1485
01:19:24,240 --> 01:19:27,000
higher than today.
Got on just fine.

1486
01:19:27,000 --> 01:19:29,600
Crop yields actually go higher,
places get greener.

1487
01:19:29,920 --> 01:19:32,440
Like I think we're going
to get into a

1488
01:19:32,440 --> 01:19:36,560
conversation that's hopefully
far more open minded and, and

1489
01:19:36,560 --> 01:19:40,320
questions things and even like
COVID and you know, I feel like,

1490
01:19:40,320 --> 01:19:44,080
and I don't want to sound too
conspiratorial, I'm sure I do.

1491
01:19:44,080 --> 01:19:47,520
I'm sure I'll get lots of shit
for this, but I, I think you,

1492
01:19:47,600 --> 01:19:51,000
you have to question a lot of
the authority regimes in the

1493
01:19:51,000 --> 01:19:54,520
West and how, you know,
groupthink gets applied and

1494
01:19:54,520 --> 01:19:56,200
you're not allowed to question
things.

1495
01:19:56,200 --> 01:19:58,360
Well, as it turns out, like a
lot of those questions were

1496
01:19:58,360 --> 01:20:03,480
pretty fucking well founded.
So, so yeah, I think, I hope

1497
01:20:03,520 --> 01:20:07,360
that we are in a place where we
can have more honest

1498
01:20:07,360 --> 01:20:10,600
conversations around things that
aren't set, they aren't defined.

1499
01:20:10,640 --> 01:20:13,980
We're constantly learning.
We should be constantly

1500
01:20:13,980 --> 01:20:16,920
re-evaluating.
And that's OK, Uncertainty is

1501
01:20:16,920 --> 01:20:19,040
fine.
I think humans very naturally

1502
01:20:19,040 --> 01:20:21,360
gravitate towards this want for
certainty.

1503
01:20:21,680 --> 01:20:24,600
And it's a bullshit illusion.
The only

1504
01:20:24,600 --> 01:20:26,240
certainty we have is
uncertainty.

1505
01:20:26,640 --> 01:20:29,680
And, and we have to just live
with probabilistic estimations.

1506
01:20:29,680 --> 01:20:34,560
There are no actual hard things.
I mean, like you

1507
01:20:34,560 --> 01:20:36,760
said, 400 years ago, everybody
thought the Earth was flat,

1508
01:20:37,400 --> 01:20:39,560
right?
That was like the best minds on

1509
01:20:39,560 --> 01:20:42,560
Earth assumed that that was the
correct way of thinking.

1510
01:20:42,560 --> 01:20:44,320
When you look at that now, it's
like laughable.

1511
01:20:44,600 --> 01:20:45,960
These weren't unintelligent
people.

1512
01:20:45,960 --> 01:20:49,880
We haven't like remarkably
gotten smarter over 400 years.

1513
01:20:50,560 --> 01:20:53,040
It was just dogmatic
thinking.

1514
01:20:53,040 --> 01:20:56,920
And I hope that, I think,
the connectivity of the

1515
01:20:56,920 --> 01:20:59,000
Internet.
I think things like podcasts,

1516
01:20:59,280 --> 01:21:04,680
like, I'm really, I feel like
media has let us down in so many

1517
01:21:04,680 --> 01:21:06,400
ways.
It's become such a part of the

1518
01:21:06,400 --> 01:21:09,520
establishment.
And I feel like now there's so

1519
01:21:09,520 --> 01:21:12,880
many more ways to connect and
have conversations that are a

1520
01:21:12,880 --> 01:21:16,960
bit awkward and a bit difficult,
but are far more open-minded

1521
01:21:17,600 --> 01:21:21,120
around possible outcomes.
And I reckon that's such

1522
01:21:21,120 --> 01:21:22,840
an awesome thing.
You guys are like, case in

1523
01:21:22,840 --> 01:21:25,240
point, you guys are literally a
case study in that you know,

1524
01:21:25,600 --> 01:21:28,240
And I think when I first even

1525
01:21:28,400 --> 01:21:30,320
rang you guys and said, Oh,
fuck, I would really love to

1526
01:21:30,320 --> 01:21:33,560
meet you because the thing I
love, the thing I was drawn to

1527
01:21:33,560 --> 01:21:36,240
was like, you're having
conversations in the open that

1528
01:21:36,240 --> 01:21:38,200
we were all having behind closed
doors.

1529
01:21:38,400 --> 01:21:41,800
No one was brave enough to come
out and talk publicly about a

1530
01:21:41,800 --> 01:21:45,640
bunch of this analysis.
And, you know, I think

1531
01:21:45,640 --> 01:21:49,200
that's an awesome thing like
transparency and back to like

1532
01:21:49,200 --> 01:21:54,760
Elon and DOGE, transparency
brings awareness and awareness

1533
01:21:54,760 --> 01:21:56,840
brings change.
And all we're trying to do, I

1534
01:21:56,840 --> 01:21:58,800
think, as a species, is just get
better.

1535
01:21:59,560 --> 01:22:01,080
I think we can all agree on
that.

1536
01:22:01,400 --> 01:22:05,920
And I think the key elements to
that are transparency, awareness

1537
01:22:05,920 --> 01:22:10,320
and change.
And I think the modern era is

1538
01:22:10,320 --> 01:22:12,800
bringing those levels of
transparency.

1539
01:22:12,800 --> 01:22:16,280
We're no longer getting dictated
to by governments and

1540
01:22:16,280 --> 01:22:18,800
central organizations.
And I reckon that's a fucking

1541
01:22:18,800 --> 01:22:21,240
remarkably good thing.
The smaller we can make

1542
01:22:21,240 --> 01:22:24,080
governments, the better.
The more efficient we can make

1543
01:22:24,080 --> 01:22:27,960
them, the better.
The more we can freethink and,

1544
01:22:27,960 --> 01:22:31,840
and innovate and, and be human,
the better.

1545
01:22:31,960 --> 01:22:35,080
So yeah, I'm pumped.
I love where we're going.

1546
01:22:35,080 --> 01:22:36,640
It feels very uncomfortable for
some people.

1547
01:22:36,640 --> 01:22:39,800
It blows my mind that very
rational people feel very

1548
01:22:39,800 --> 01:22:41,480
uncomfortable about these
things.

1549
01:22:41,480 --> 01:22:43,840
I think that's
a pretty mystical place to end

1550
01:22:43,840 --> 01:22:47,440
the conversation, Rusty, Brice.
It's been an absolute pleasure

1551
01:22:47,680 --> 01:22:50,560
having the both of you to
explore this new territory.

1552
01:22:50,920 --> 01:22:52,640
Thanks
so much for having us and

1553
01:22:52,640 --> 01:22:55,480
thanks so much for being open to
having these chats.

1554
01:22:55,480 --> 01:22:58,760
Mate, I love it.
Appreciate it guys, anytime.

1555
01:22:58,760 --> 01:23:00,920
Gents, thank you so much for the
route.

1556
01:23:02,640 --> 01:23:07,960
Right, there you go, Rusty's
looking better than ever.

1557
01:23:09,520 --> 01:23:12,320
There was plenty for people to
chew on in there.

1558
01:23:12,320 --> 01:23:16,160
It sure got me thinking.
All right, so I reckon we can,

1559
01:23:16,960 --> 01:23:18,640
we can leave it at that.
And I'm keen to hear what the

1560
01:23:18,640 --> 01:23:20,920
money miners think.
There was so much in

1561
01:23:20,920 --> 01:23:23,960
that, such a different style
of conversation.

1562
01:23:24,520 --> 01:23:28,320
So it's good to hear what
people like Rusty and Brice

1563
01:23:28,360 --> 01:23:31,240
think about the industry and
where it's going and everything.

1564
01:23:31,240 --> 01:23:34,200
It was a truly wild
conversation.

1565
01:23:34,200 --> 01:23:36,960
I thought, yeah.
I was, I was watching CNBC last

1566
01:23:36,960 --> 01:23:38,960
night.
They had Jensen Huang on from

1567
01:23:39,400 --> 01:23:46,280
NVIDIA and, mate, the
feeling you get talking about

1568
01:23:46,280 --> 01:23:49,560
their growth and everything with
their AI chips that they've got.

1569
01:23:49,560 --> 01:23:54,600
And it is, yeah,
friggin interesting to listen

1570
01:23:54,600 --> 01:23:56,640
to.
So you're talking about NVIDIA

1571
01:23:56,640 --> 01:24:00,160
growth outlook.
It's like, well, they're leading

1572
01:24:00,160 --> 01:24:03,280
the pack with these friggin AI
chips that are all going to be

1573
01:24:03,280 --> 01:24:06,720
feeding these data centres.
So yeah, it is a fascinating

1574
01:24:06,720 --> 01:24:08,640
industry.
Without

1575
01:24:08,640 --> 01:24:11,880
doubt, without doubt. Keen to see
what it does to our

1576
01:24:11,920 --> 01:24:13,040
industry.
Hey, yeah.

1577
01:24:13,040 --> 01:24:16,160
Very much so. Right, you
don't need AI to get 100 bucks

1578
01:24:16,160 --> 01:24:18,400
off an AusIMM Underground
Operators ticket.

1579
01:24:18,400 --> 01:24:20,200
You don't need it.
Go to the link in the show

1580
01:24:20,200 --> 01:24:22,160
notes.
That's coming up soon mate.

1581
01:24:22,160 --> 01:24:26,480
We've also got mate, we've got
JRX discounts for the Osmo and

1582
01:24:26,480 --> 01:24:30,760
AusIMM conference in Brizzy,
mate, there's just

1583
01:24:30,760 --> 01:24:33,840
discounts flying everywhere, 190
or 160 off that.

1584
01:24:33,840 --> 01:24:36,680
If you're a member, right.
You know, our great partners as

1585
01:24:36,680 --> 01:24:40,240
always, that just don't need to
do discounts because they're so

1586
01:24:40,240 --> 01:24:41,840
bloody good, but they'll
probably do it anyway.

1587
01:24:42,160 --> 01:24:46,400
Mineral Mining Services, Grounded,
Sandvik Ground Support, CRE Insurance,

1588
01:24:46,720 --> 01:24:50,800
K-Drill, WA Water Bores, Swick,
Quattro Project Engineering,

1589
01:24:50,840 --> 01:24:55,920
Cross Boundary Energy.
The information contained in

1590
01:24:55,920 --> 01:24:58,720
this episode of Money of Mine is
of general nature only and does

1591
01:24:58,720 --> 01:25:01,480
not take into account the
objectives, financial situation

1592
01:25:01,560 --> 01:25:03,560
or needs of any particular
person.

1593
01:25:03,880 --> 01:25:06,880
Before making any investment
decision, you should consult

1594
01:25:06,920 --> 01:25:09,960
with your financial advisor and
consider how appropriate the

1595
01:25:09,960 --> 01:25:13,680
advice is to your objectives,
financial situation and needs.