From a very young age, Stephen's parents knew he was likely to become an engineer. He didn't know exactly what engineers did, but he knew they were responsible for making things, and that was the space he wanted to be in…
I was always very passionate about it and my family encouraged that passion.
Extra discussions during the episode
Future: The future is going to be disruptive, but it will benefit the environment.
I'd like to see a real disruption in how we do construction in Australia
Advice: Something that guided Stephen in his career: follow your passion.
Find something within engineering that you're passionate about
SpaceX Falcon 9
Gave me a bit of a passion for space travel
Leonardo Da Vinci
He wasn’t just an engineer… he was an engineer in multiple disciplines
Stephen Bornstein is the founder and CEO of Cyborg Dynamics Engineering, a robotics and AI company.
Stephen has over 10 years' professional experience, having worked on the Nulka Active Missile Decoy upgrade program (Australia's largest defence export), on the Electron launch vehicle as a senior test engineer at Rocket Lab during its early years, and on the MRH90's entry into Special Forces service at Airbus.
Outside of his employment, Stephen developed a drone to counter rhino poaching in Africa, led the design of a cockpit simulator for the Royal Flying Doctor Service education program, and co-founded and co-led the Engineers Without Borders Makerthon event in Queensland in 2018.
Stephen founded Cyborg Dynamics Engineering in 2017 to enable high-tech projects to take place within Australia, having seen so many great Australian engineers choose to work overseas.
In 2017 Stephen won Australian Young Professional Engineer of the Year for his services to the industry and the profession.
And in 2020 he was recognised on the 30 Most Innovative Engineers list.
This is a “close” copy of the words that were spoken during the Podcast, Season 4 Episode 8
It is not 100% accurate.
The guest was Stephen Bornstein
Stephen: [00:00:00] I've wanted to be an engineer ever since I was very young, playing with Lego as a little kid, and then programming my graphics calculator in high school to solve all the problems with the equations written out, so you could copy it straight off the calculator, and moving into some more mechatronics-type stuff, building little hovercrafts.
[00:00:18]So I think my parents knew I was going to be an engineer from a very young age.
[00:00:21]Mel or Dom: [00:00:21] Do you have any other engineers in the family, or were there other engineers around that you knew?
[00:00:25]Stephen: [00:00:25] I don’t think I had a direct role model per se, but I was always very passionate about it and my family encouraged that passion. So I think that was probably part of the reason, but I was that obsessed engineer from a young age.
[00:00:37] Mel or Dom: [00:00:37] So what was the first project you worked on after you graduated?
[00:00:40] Stephen: [00:00:40] So the first project I worked on was the Nulka Active Missile Decoy. We were doing an upgrade on that, which was the largest upgrade we’d done in 10 years. And it was the first time we were going to do flight trials on what is Australia’s largest defense export. It’s a hovering missile that protects ships.
[00:00:57] So I was straight in the deep end and I loved every minute of it. So the first year, a year and a half was fairly intense. We did wind tunnel testing and mechanical design and acceptance testing and going from an idea to actual flight trials about 18 months later was a really amazing experience.
[00:01:15]Mel or Dom: [00:01:15] That’s an awesome first project to work on.
[00:01:18] Stephen: [00:01:18] Yeah, pretty blessed. And I think that probably propelled the rest of my career from there as a result of just the opportunities I had as a graduate.
[00:01:26] Mel or Dom: [00:01:26] And that was through a private company? You're not in the military, are you?
[00:01:31] Stephen: [00:01:31] That's correct. I'm an army reservist, but that's not as an engineer. It was through BAE Systems Australia, and it was a longstanding program that started with the Defence Science and Technology Group, was then commercialized through BAE Systems Australia, and is now being exported to the United States on Navy ships.
[00:01:52]Mel or Dom: [00:01:52] That's amazing. That's such a pivotal first project to get started on. With all the engineers we've spoken to, it's amazing how many cool projects were their first projects. It just amazes me that after you finish your degree, you can step into these sensational jobs straight away.
[00:02:09] It's almost as though a lot of the engineers haven't had to build up to those kinds of projects, the ones where you stop and go home thinking, I can't believe I just worked on that, right out of the box. Can I just check, though? You weren't collecting coffee for people doing the real work? What sort of things were you doing?
[00:02:28]Stephen: [00:02:28] So I did a lot of design work, firstly. We were doing an air data systems upgrade and also an inertial measurement unit, which measures the angles of the rocket in flight. We were updating from the old iron gyroscopes that spin around, which have been around for like a century, and replacing them with an inertial measurement unit.
[00:02:46] And then the air data system got upgraded as well. So I was involved in designing and acceptance-testing the pressure vessel used to test the air data system, as well as designing test jigs for wind tunnel testing, carrying out the wind tunnel testing, performing acceptance testing on the data recording devices for the trial, supporting the trial, and then moving on to designing the automatic test equipment once we put these things into production, churning out about a hundred units per year. So a low-run production type thing, which is sold to the US and the Australian Defence Force.
[00:03:21] Mel or Dom: [00:03:21] And you had a lot of support within the company as a graduate? You weren't left out on your own to do these?
[00:03:28] Stephen: [00:03:28] No, no. It was a good, experienced team, but as a graduate there were a few sleepless nights where quite a lot of responsibility was put on me at a young age. So I always say to young engineers who want to do research and development and commercialization, which is what I do a lot of now: it's not going to be an easy job. You're going to have times where you go home and try to figure out how to solve a problem, and you've got to be prepared to obsess over it, so that when you come into work the next day you can actually get it done.
[00:03:57]Mel or Dom: [00:03:58] Nothing like obsessing over a problem. So from such a great start, where are you now?
[00:04:03]Stephen: [00:04:03] So I'm currently the CEO of Cyborg Dynamics Engineering, a company that I founded a few years ago; I've been full time with it for a little under 18 months now. We do high-tech product development. Robotics and AI is probably our subject-matter expertise, but we've done a number of other commercialization projects in the robotics space as well.
[00:04:26]Mel or Dom: [00:04:26] Are there specific sectors that you cater towards? Is it defense or biomedical, or are there particular areas?
[00:04:33] Stephen: [00:04:33] We leverage heavily on my defense engineering background. About 80% of our work currently is defense, with COVID; it was probably 60 to 70% before COVID, when we had a good 30 to 40% commercial projects. That's swung a bit towards the defense work because it's a bit more stable in the current economy.
[00:04:52]Mel or Dom: [00:04:52] So, being the CEO of your own company, you mentioned that you're still doing the research and development. Is it difficult to juggle being a CEO and still keeping your finger on the pulse of the development side, which it sounds like you really do enjoy?
[00:05:09] Stephen: [00:05:09] You have to manage your time reasonably well. You have to have a certain part of the day allocated to business development and a certain part of the day for accounting. You have to delegate the things you don't enjoy doing to someone else, stay on top of the engineering process, and hire really good people that you can trust to actually do things when you're not around, or when you don't have the capacity to support them fully.
[00:05:34]Mel or Dom: [00:05:34] It sounds like you've learnt some really strong lessons already.
[00:05:37]Stephen: [00:05:37] Yeah, I haven't lost too many hairs yet, but, you know, touch wood, time will tell. I'm sure you understand, Dom; I gather you run your own shop as well.
[00:05:46] Mel or Dom: [00:05:46] Yeah, I do. I have a lot of grey hair; it came on fairly quickly and it's hanging around, that's for sure. It doesn't go away, it just keeps accumulating. But do you enjoy the business side of it as much as you enjoy the engineering side?
[00:06:02] Stephen: [00:06:02] Yeah. I don't think I got great exposure to the business side in the first five or six years of my career, and I remember when I did start it was an insanely steep learning curve, just because I didn't know the first thing about, you know, maintaining IP and contract law. I'd always worked for big companies, where we had commercial managers with law backgrounds, and the accounting was handled by company accountants who looked after all those things as well.
[00:06:26] So, it was a real crash course in that, but it’s actually good to understand all those aspects of the business, because then you can be better at managing the projects and running the team by having that holistic viewpoint, even things like the social media marketing. If you start as a one person team, you have to have exposure to all of that, which means I think you become a better leader of those people that you then start delegating to in the future.
[00:06:50]Mel or Dom: [00:06:50] Yeah, I think it really aids the culture of the company as well, because if you've been across it all, you understand what other people are doing and what you need them to do, so you can get everyone on the right path, all rowing in the same direction. It's one of those hard things. I've spoken to a few engineers who really have a problem letting go of the engineering; even as engineers in big multinational corporations, when they start moving up the chain into management, they long for the engineering, to actually be able to do the hands-on stuff. It's definitely a challenge unto itself, and a really rewarding part of any business, I think.
[00:07:31] Stephen: [00:07:31] One that's fairly close to us at the moment with Cyborg Dynamics, in the robotics and artificial intelligence space, is the ethics of AI: as we continue to push artificial intelligence, what are the ethical implications associated with that, particularly in certain use cases?
[00:07:47] So when we talk about self-driving cars, how do we deal with things like moral dilemmas? Traditionally engineers have been very black-and-white people, and they've never needed to understand philosophy, for instance, but now we're dealing with quite complex ethical issues that were traditionally handled by philosophers.
[00:08:06] But as we add in this artificial intelligence to our systems and our robotics, we actually need to understand philosophy and moral theories in ways that we never needed to before.
[00:08:17]Mel or Dom: [00:08:17] Like in the past, if you were driving a car and there was an accident, there was a human input, so there wasn't that necessity to overthink what was going to happen, because if the human had to make a choice about what they hit, it came back to human nature.
[00:08:35] Whereas do you find with AI now, it's almost as though we're trying to solve problems that we didn't have to consider before?
[00:08:42] Stephen: [00:08:42] Yeah, correct, you've hit the nail on the head. I presume you've read a few articles on the autonomous car problem: who do you hit? That's the classical moral dilemma. Let's say an accident is inevitable. You could hit the car in front of you, which will give its occupants a level of protection.
[00:08:57] You could hit a child, you could hit an animal. But then you need to weigh up how much damage you could do to the different people you might hit in that collision, and therefore what is the higher priority: who you might injure, or the level of damage that you would cause in that injury?
[00:09:15] And you have to look at different moral theories, such as the maximin function of a contractarian model versus, potentially, utilitarian models. Traditionally engineers haven't needed any exposure to this stuff, so we're drawing on philosophers to try and understand it. Now, I don't work directly in this space, but we work indirectly in it for human-in-the-loop operations, where we have AI providing decision support or evaluation to help an operator make a better decision.
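The two moral theories contrasted here amount to different selection rules over predicted harm. A minimal sketch, where the scenario names and harm scores are entirely invented for illustration:

```python
# Two ways an autonomous system could rank unavoidable-collision options.
# Each option maps affected parties to a hypothetical predicted harm score (0-10).
options = {
    "brake_into_car":  {"own_passenger": 4, "other_driver": 4},
    "swerve_to_verge": {"own_passenger": 7, "pedestrian": 0},
}

def utilitarian(option):
    # Utilitarian rule: minimise total harm summed across everyone affected.
    return sum(option.values())

def maximin(option):
    # Contractarian maximin rule: judge each option by its worst-off party,
    # and prefer the option whose worst outcome is least bad.
    return max(option.values())

best_utilitarian = min(options, key=lambda name: utilitarian(options[name]))
best_maximin = min(options, key=lambda name: maximin(options[name]))

print(best_utilitarian)  # swerve_to_verge: total harm 7 beats 8
print(best_maximin)      # brake_into_car: worst-off harm 4 beats 7
```

The two rules can disagree, as here: the utilitarian rule accepts concentrating harm on one person if it lowers the total, while maximin refuses to, which is exactly why the choice of moral theory has to be made explicit before it is coded.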
[00:09:44]Mel or Dom: [00:09:44] Why is it even engineers that are making this decision? I can understand the engineers are going to code it or build it, but why are we not talking about people who have studied this sort of thing being the leaders on the question?
[00:10:00] Stephen: [00:10:00] Yeah, that's a good question, and there are philosophers who have written books on this. There's Derek Leben, who has actually consulted to our company. He wrote a book called Ethics for Robots: How to Design a Moral Algorithm, in which he proposed a function that identifies who's going to be the worst off in, let's say, a car collision, and minimizes the damage to the person that's worst off.
[00:10:21] So he's come up with that. And it's the engineering company's responsibility to say: what moral theory are we going to adopt in the form of an algorithm, and who's going to be held accountable for it? Some people say the philosopher who advised might be accountable; others say it's the program director who signed off on the code that's accountable.
[00:10:42] And then some people are saying there needs to be government legislation saying this is how we're going to design autonomous cars in the event of these moral dilemmas, and we're going to collectively make a decision on what we think is an ethical approach to this problem.
[00:10:58]Mel or Dom: [00:10:58] You got to that point really quickly: the policy decision will vary from country to country, even state to state. Using the car analogy as we continue on, because there are a lot of other applications of AI: if you bought a car that was programmed in France and they shipped it to Australia, that programming could be completely adverse to Australian policy, so it would need to be aligned. You've got those sorts of considerations involved as well.
[00:11:36] Stephen: [00:11:36] Yeah. So you've got local automotive policies and regulations, and it may be that the AI is incorporated into those local policies, regulations and associated standards. You know, we might have emission standards, and in Australia you're driving on a different side of the road,
[00:11:52] so the car's on the opposite side, those sorts of things. And it may be that you then have code associated with what is deemed to be an acceptable level of autonomy in that platform, what alerts need to be provided to the humans, and when they need to take over, those sorts of things.
[00:12:06]Mel or Dom: [00:12:06] Do you find, as AI evolves, that the philosophical problems around it are getting bigger and more complex, or are they starting to hone things down so that they've got stronger solutions and stronger answers to these ethical problems?
[00:12:20] Stephen: [00:12:20] It's a good question. I think with the advent of deep learning, we lost a level of traceability in how a neural network was trained, and what that resulted in was some ethical considerations. Okay, the neural network has identified this as a given object, for instance, but why has it identified that? Well, based on a data set, training and optimization, that's created a series of essentially neural pathways within its own deep neural network model. It's done the same for speech, and the same for predictive maintenance and things like that. So as the networks get more complex, the level of autonomy they can provide increases, but the traceability of why a decision was made can also become more complicated, and therefore you get more complex ethical questions around it. You know, if we look at Deep Blue, the chess program that beat Garry Kasparov, that was very number-crunching-heavy traditional machine learning, as they call it: it really just assigned a series of weightings and went through a decision tree. But if we look now at Google's AlphaGo, the level of traceability in a deep learning model isn't the same.
[00:13:38] We don't always know, to the same level, exactly why it did what it did. Now, that traceability is being improved through a variety of techniques and methods so that we can better understand why deep learning models do what they do, but it's still a work in progress.
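The "series of weightings" style of evaluation described for Deep Blue can be sketched in a few lines: the score decomposes into named, inspectable terms, which is exactly the traceability a trained deep network does not directly offer. The feature names and weights below are invented for illustration, not Deep Blue's actual evaluation function:

```python
# A hand-crafted, Deep Blue-style evaluation: score = weighted sum of named
# features. Every contribution is inspectable, so the "why" is traceable.
WEIGHTS = {"material": 1.0, "mobility": 0.1, "king_safety": 0.5}

def evaluate(features):
    # Return the total score plus a per-feature breakdown for tracing.
    contributions = {name: WEIGHTS[name] * value for name, value in features.items()}
    return sum(contributions.values()), contributions

score, why = evaluate({"material": 3, "mobility": 12, "king_safety": -2})
print(score)  # 3.0 + 1.2 - 1.0 = 3.2
for name, contribution in why.items():
    print(f"{name}: {contribution:+.1f}")
```

A deep network replaces the hand-written `WEIGHTS` table with millions of learned parameters, so no comparable per-term explanation falls out of the model for free; that gap is what interpretability techniques try to close.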
[00:13:56]Mel or Dom: [00:13:56] It almost sounds like we've made a cake, but we don't know how we made it, and so now we have to create programs to understand and pull apart the ingredients to work out how we ended up with that cake.
[00:14:08]Stephen: [00:14:08] Yeah, to some extent, I think. I can't go too far, because, you know, there's a lot of great work that's been done with deep learning and understanding deep learning, and a lot of it comes down to your data set and your product assurance. Good companies have very good product assurance on their AI.
[00:14:26]We've invested heavily in product assurance on our AI; we were one of the first companies in Australia to do that in the defense space, particularly in regards to some certification that will come at the end of some of our projects. But I guess we came up with this great thing in deep learning without taking that step back to ask:
[00:14:47] how do we 100 per cent trust what's coming out of it? Because whilst it is linear algebra, how it trains and optimizes the system is very different to a traditional decision tree.
[00:15:01]Mel or Dom: [00:15:01] Are we going to get to a point where we trust AI implicitly or should we never get to that point?
[00:15:07]Stephen: [00:15:07] It's a good point. Any neural network we develop, we should not roll into any commercial use unless we trust it and we have verification and evidence to demonstrate that we can trust it. That's a combination of assurance in the data set and assurance based on all the inputs we've given it and all the outputs we get, the same way that we'd perform unit testing on any given piece of software. The only difference is what's sitting in the middle: the AI model is far more complex than traditional decision trees, but it is still a decision tree, just a much more complicated one.
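The unit-testing analogy can be sketched directly: treat the trained model as a black box and assert on its input-output behaviour, exactly as you would test any other function. Everything below is a hypothetical illustration; the `model` function is a trivial stand-in for a trained network, and the speeds, labels and thresholds are invented:

```python
# Assurance-by-testing sketch: the "model" is a black box; we check its
# behaviour on known inputs plus a simple invariant, like unit-testing software.
def model(speed_kmh: float) -> str:
    # Stand-in for a trained classifier that flags unsafe speeds.
    # A real deployment would load a trained network here instead.
    return "unsafe" if speed_kmh > 100.0 else "safe"

def run_assurance_suite(predict) -> list[str]:
    failures = []
    # Known-answer cases, like fixtures in a unit-test suite.
    for speed, expected in [(30.0, "safe"), (60.0, "safe"), (140.0, "unsafe")]:
        if predict(speed) != expected:
            failures.append(f"known case failed at {speed} km/h")
    # Invariant: the output must always be one of the declared labels.
    for speed in range(0, 200, 5):
        if predict(float(speed)) not in {"safe", "unsafe"}:
            failures.append(f"invalid label at {speed} km/h")
    return failures

print(run_assurance_suite(model))  # [] means every check passed
```

The point of the sketch is that the suite only ever touches the model through its inputs and outputs, so the same harness works whether the middle is a hand-written rule or a deep network; what changes is how many cases and invariants you need before you trust it.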
[00:15:43]Mel or Dom: [00:15:43] So, how do you see society moving forward with AI? What’s the solution here? Are we ever going to get to a golden standard of ethics with AI?
[00:15:53] Stephen: [00:15:53] Yeah. There have been a number of AI principles and policies developed by the Australian Government and CSIRO, and a number of major tech companies like Microsoft and Google have come up with their own AI principles. So they've sort of got their gold standards for how they want to practise, but I think what principles you adopt, and what that gold standard looks like, will vary with the application the AI is put to.
[00:16:19] So the ramifications of that AI doing something it shouldn't will determine what that gold standard looks like, and the level of trust and assurance that you need on it, particularly for mission-critical applications. For instance, with AI in med tech, you want to be able to 100 per cent trust that it's going to pick up whatever it's looking for when it's doing vision-based AI on CT scans and so on.
[00:16:44]Mel or Dom: [00:16:44] So you were mentioning companies that have the gold standard, but companies, at the end of the day, are driven by profit. What I'm getting at is, one of the things you've said previously is that the government and the policies, the leaders of the people, are the ones that will agree and move forward on this.
[00:17:12] So if the companies are saying, this is our gold standard, how are they ensuring that it's aligned with what humanity wants?
[00:17:19]Stephen: [00:17:19] It's a good question, and I think what they'll usually do is bring in subject-matter experts from outside the company, like philosophy experts and so on. That's what we did for some of our AI projects: we brought in a panel of about eight people, and then we went to some very large working groups organized by the government. Based on that, we're in the process of conducting a series of surveys and case studies with the University of New South Wales at ADFA, specifically for our applications. So as a company we're not really coming up with the concept ourselves; we're bringing in consultant ethicists who advise us on what they think is suitable based on our use cases.
[00:18:04] Mel or Dom: [00:18:04] I like how you circled back around to the philosophers. I feel like they're the ones building our foundation, almost; they're giving us the framework for the future to build upon. And I suppose with engineering, as with anything,
[00:18:16] the client is also the one setting the parameters, isn't it? So whilst we need to consider all that as engineers, what the client needs for their end goal is going to set a lot of the standards that make sure it maintains those levels of safety, traceability and acceptability.
[00:18:36] Stephen: [00:18:36] Yeah, absolutely. I mean, you wouldn't want to jump into an autonomous car if you didn't know what decision it was going to make in a given scenario. That trust that needs to be established between the company and the client is what's going to enable this to actually move forward. You know, there's AI that predicts your text when you type something into Google search; the consequence of that being incorrect is probably not as significant as the consequence of an AI that's driving your car being incorrect.
[00:19:08]Mel or Dom: [00:19:08] Yeah, and I think what you just said is very key: trust. It all comes down to humans trusting the computer and the AI that's put into place. So what are your thoughts on the future of engineering?
[00:19:22]Stephen: [00:19:22] I think where I see a lot of benefit and a lot of growth is certainly going to be in sustainability, renewable energy, and coming up with new methods for affordable housing, particularly in Australia. I mean, we've been putting up houses the same way for a hundred years, you know, project homes. I'd like to see a real disruption in how we do construction in Australia. I'd like to see better transportation methods, certainly between our cities, high-speed rail and things like that. And we've got to do these things sooner rather than later, because it's just going to cost more and more, for land, for renewable energy and so on, if we leave it too far into the future.
[00:19:59]Mel or Dom: [00:19:59] I've always said the construction industry is just ripe for disruption. How old are bricks? I'm going to say thousands of years old; they've been building houses with bricks of some description, mud, clay or whatever, the same way.
[00:20:20] So, I mean, there are techniques out there, there are opportunities for disruption, but they're so hard to get up. It just seems as though everyone's so set in their ways, not only in the building industry but in quite a few of them, particularly in regards to sustainability, energy and resources. It's almost as though, oh, we've been doing it like this and it works,
[00:20:39] so let's just keep doing it like this. And I think it's going to be one of those things that creeps up on you, particularly in regards to climate change, where things are getting worse. Time passes so quickly that we'll be sitting here 10 years from now going, oh damn, we really should have done something 10 years ago.
[00:20:56] And unless people start getting their heads around it now, that 10 years is going to pass very quickly.
[00:21:04] Stephen: [00:21:04] It's interesting you say that about climate change. The Queensland Young Entrepreneur of the Year sells solar panels for a living. He did $50 million in revenue in five years, starting with $400. This guy is selling an enormous amount of solar panels.
[00:21:19] He's got a hundred people working for him now. And I said to him, how do you sell so many solar panels? He goes: with the government rebate and the amount of money they save on their bill each month, I can show them how they're going to be making money in one to two years. And I find people who are going to be living in their houses for the next five to ten years.
[00:21:38] And in terms of renewable energy, until it can be done cheaply and affordably for the mass market and you can democratize it, we're going to be stuck in our ways, as you said, until the giants find an efficient, cheap way of providing mass energy.
[00:21:54] Mel or Dom: [00:21:54] Yeah, you really do need that sort of tipping point. A lot of the things you mentioned about the future of engineering need that push to gain momentum into those areas; otherwise people tend to rest on their laurels a bit. What would you say to people that are just starting out in their engineering degree?
[00:22:14]Stephen: [00:22:14] In terms of the degree, I'd say find something within engineering that you're passionate about. There are so many disciplines. Look at the things you want to do and the impact you want to achieve, and align your discipline to that impact, because if you do, you'll propel yourself a long way and you'll have a very rewarding career. That's what I did: I found aerospace and robotics were the areas I was passionate about, and, you know, I've been very blessed with what I've been able to do in my career as a result.
[00:22:41]Mel or Dom: [00:22:41] Were you at all put off by the fact that we didn’t have a huge aerospace industry in Australia?
[00:22:48]Stephen: [00:22:48] Yeah, particularly in my final year of uni. I actually sat a number of interviews where I said I was interested in doing research and development and commercializing new innovative products, and in two of the interviews they said we don't do much of that in Australia, and I didn't get the job. That was certainly a little bit confronting, but, you know, I pressed on, I found a good opportunity in the end, and I guess it all happens for a reason.
[00:23:11]Mel or Dom: [00:23:11] Yeah. When I went through, we had Introduction to Engineering, where we had to go out and speak to engineers, which I think was probably the best thing I could have done. Not only did it give me an idea of what I wanted to do, I also met a few engineers where, after meeting them, I thought:
[00:23:25] I don't want to do that. I definitely know that's not the area I want to be in. And it's great, because engineering's a little bit forgiving, too: you can start out in one discipline and end up in another as you go along. I know one of the guys I studied with started out in civil and ended up in environmental by the end of it.
[00:23:46] Stephen: [00:23:46] Yeah.
[00:23:46] Mel or Dom: [00:23:46] It has a lot of flexibility compared to a lot of other courses. So, just to finish up, what's a piece of engineering that impresses you?
[00:23:55]Stephen: [00:23:55] Well, having spent some time at Rocket Lab, it would have to be the SpaceX Falcon 9 with the recovery. When they did the Falcon Heavy and recovered both boosters at the same time, landing side by side, and now the fact that they've got the Crew Dragon, this really cool space capsule, it's just incredible.
[00:24:16]Mel or Dom: [00:24:16] Have you seen that in person?
[00:24:18] Stephen: [00:24:18] No, no, just videos and things like that.
[00:24:20] Mel or Dom: [00:24:20] I thought you said you were at Rocket Lab?
[00:24:22] Stephen: [00:24:22] Oh, I suppose at Rocket Lab I worked on Electron for about a year and three months, but, you know, that gave me a bit of a passion for space travel. And then seeing what they've done at SpaceX with rockets has been really impressive.
[00:24:36]Mel or Dom: [00:24:36] you a frustrated astronaut?
[00:24:38]Stephen: [00:24:38] Yeah, that's a good question. Before I went to Rocket Lab, I think a part of me did want to maybe one day be an astronaut. And then, when I was in New Zealand, I saw the best Australian engineers working for an American-owned company in New Zealand, creating jobs for New Zealanders.
[00:24:57] And I thought, if I become an astronaut, it's just for me. I'd rather come back to Australia and try to get these Australian engineers back to Australia, or keep them here, by actually working on some good stuff. So I think that dream passed when I was about 25, 26.
[00:25:14] Mel or Dom: [00:25:14] That's awesome. I think Dom's still holding on to that dream, actually, even though it's well and truly past by now. Space exploration is something that fascinates me, and I love the fact that it's evolving so quickly and they're doing so much great work in that space.
[00:25:37] I like what Stephen just said, that he wanted to actually help build the industry in Australia. That's absolutely sensational. We have so many wonderful engineers here; bringing them back home is a wonderful thought for the country, and it's great that you're actually doing that.
[00:25:57]And to finish off. I’m really interested to find out who is an engineer that you admire.
[00:26:03]Stephen: [00:26:03] It would have to be Leonardo da Vinci, from the Renaissance. Not just because he was an engineer, but because he was an engineer in multiple disciplines, hundreds of years ahead of his time, and a painter, with all these other skills that he had. I think that was very inspiring for his time.
[00:26:21]Mel or Dom: [00:26:21] Honestly, to still feel his presence all these years later: he really did change society. The things that he came up with, the foresight that he had with the things he was thinking about back then, they're still being applied today.
[00:26:39]It's pretty amazing. He probably covers off your whole list: he's a philosopher and an engineer as well as an artist.
[00:26:45] Stephen: [00:26:45] Well, I only learned anything about philosophy a year to a year and a half ago, when I started down the AI route.
[00:26:53] Mel or Dom: [00:26:53] A steep learning curve! Well, thank you so much for joining us tonight. It's been wonderful. It's been awesome.
[00:27:01]Stephen: [00:27:01] Thanks for having me, really appreciate it.
And thank you for listening to another great episode of Engineering Heroes, as we present the new dawn of engineering challenges, for Engineers Australia. You can view our show notes or learn more about our podcast by visiting our website, www.engineeringheroes.com.au.
If you enjoyed today's show, all we ask is that you go and tell someone. Seriously, it's that easy: either in person or by writing a review, just get the word out. That's how you can support engineers everywhere.
We look forward to you joining us next week, when we bring you another interview with one of our engineering champions.