Hello and welcome to the Delinquent Data podcast with your hosts, us lot. There! We've got the opener. Sweet!
Do rightly, do rightly. Do we have a theme for tonight, apart from, you know, us three are dickheads and we're gonna talk shit?
I think that's the general theme anyway, isn't it?
The same thing we do every night, Pinky. I don't think there's been anything. I mean, I think we did have a previously arranged one, and we kicked the can down the road a couple of times, and it's not atypical that we've suddenly drifted away from temporal relevancy. So, what has AI fucked up for us this month?
Jason, you made a good point already. Actually, I think this is a good one to get into: AI ethics. I'm totally stealing your wording, it's beautiful: being an AI ethics expert is like being the new social media influencer.
I've seen more crop up in the last three days. Everyone's suddenly like, "Oh, wow", well, yeah...
Excuse me, I've been on the P70... the P7014 working group for a year. And I've been sitting there kind of going, this is all fairly esoteric, to say the least. But there is this sort of glut of... it's mostly, from what I can see so far, and I'm probably going to get in trouble with the working group for this one, but an awful lot of...
Yeah, 30 years ago these would have been philosophy grads. It's just that this is the... well, how do you critique the way the system works when you don't need to know anything about the system? So...
So, if we actually do have an audience: Bolster is referring to this IEEE working group looking at ethics in empathic intelligence, or empathic ethics in intelligence... all sorts of complexity.
Yeah, standards for empathic technology that then also include ethical explainability of that empathic technology. It all gets confusing. Feature creep all over the place.
Yeah, the thing I liked about your point, Jase, is that this has become like a social media influencer thing. I will say my position on this, right: I'm really interested in philosophy.
I am a philosophy dork-nerd type, and I'm vaguely aware of AI. Quite honestly, my interest in AI ethics is philosophical, but I see huge amounts of discussion around AI ethics that honestly seems completely meaningless.
From my perspective, there are a couple of levels in this, because my research background drifted into taking research from sociology and applying it to autonomous systems. So that's not AI ethics. It's not autonomous sociology either. It's stealing: we were looking for a good pattern that we could model that lowers the information density of a community of humans.
If we can take those structures and go, okay, switch out the talky-talky and the faces and the flags and everything like that, switch to some other kind of technical identifier, then we can use that to literally optimize the communications of a network. Because whenever we talk about how we trust each other, or what the definition of a community is, it's generally a load of communication shortcuts.
It's what color is your skin? What style is your hair? What shape is your face? What accent do you speak with? There are ways of saying, are you part of my party?
Yeah. Do you have a beard?
Yeah. Do we have some kind of commonality that can either be a shortcut to "I know that you have some shared set of values with me", or it's the out-group assessment of going, okay...
"Oh no, he's got a Spurs top. I'm not going to get along with him."
Or, in our case, Jase is clearly the out-group.
Well, and you can never get rid of the English.
You have hair and no beard, Jase, your hair is in the wrong place. It's upside down, upside down.
Yeah, it is upside down, isn't it? Yeah. It hadn't escaped my notice...
Just to finish up that point...
I mean, there was a talk that I gave... as part of NIDC, as part of Raspberry Jam, as part of CoderDojo, which was about robot psychiatry. And it's my favorite subject, because it's completely made up, or it was completely made up until a couple of years ago, when somebody started actually using it.
But it's this idea that was coined by Isaac Asimov in "Liar!", I think, which was a short story basically around this idea of an intelligent system, a robot, that, if I remember correctly, learned how to lie. But the story was quite entertaining, because the way it manifested this lying was to tell people what they wanted to hear.
Sorry, it was telepathic. It was a telepathic robot; it could read your mind. And then it told you what you wanted to hear, your deepest, darkest desires, and just said that that's what was happening. So it was lying to you. But the ethics of whether you want that system to lie or not... that's somewhere in this boundary between the technical side of...
How do you literally get the bits and bytes to flow the way you want, versus how do you assess the complex meta-structure of cognition in order to explain to this created machine, and it doesn't matter whether it's an artificial intelligence or a biological one, "the thing that you're doing, I understand why you think it makes sense, but it's wrong". That's basically what psychiatry is, or cognitive behavioral therapy, or whatever way you want to look at it. And I think that's a much more interesting area than just going, "Oh, I do AI ethics".
Yeah, okay. The thing is, what you just said, that idea of, you know, the robo-psychiatrist that just tells you what it thinks you want to hear...
I mean, it's not, did you not just describe Facebook?
No, sorry, to be clear: the robot psychiatrist, I keep forgetting her name and it's really annoying, was the human character in the story whose job role was to analyze intelligent systems that had gone awry one way or another.
So "Liar!"... I can't remember. Again, terrible memory.
This is why I have things like Facebook, to remember friends. But yeah, it was a fascinating story, and it just led on to a series of... this overlap between the social and psychological, which are really just the analysis of complex systems, and the technical implementation of those systems.
But if we look at AI ethics as being... if it's not a social media influencer trend, or, you know, any of the various other cynical takes we can have on this...
Ethics is a branch of philosophy, a highly complex and important branch of philosophy that I think all three of us are genuinely quite interested in, right? And AI ethics is the branch of philosophy that should look at ethics as it applies to artificial intelligence. Have any of you been particularly convinced that any of the people talking about AI ethics are experts in AI or philosophy? Never mind both.
Speaking as one of them: no.
This is the thing I have been looking to do for the AINI events. I have been looking for quite a while to get an actual philosopher to come and speak. Right? I want to find an actual philosopher who's knowledgeable about AI and AI ethics. I can't find one.
Hmm. Honestly, can't find one.
So, my beef, and it might be with a lot of things, actually, is that a lot of machine learning and big data and AI in general is feeling more and more like an academic field and not a practitioners' field. Yep. Papers are released, we go read the papers. Well, I don't read them anymore.
And they are separating themselves from the real world at a rapid rate of knots, and it's really, really hard to stomach anymore. In all fairness, I'm getting bored of it, because there's no business advancement. And that's the interesting thing with Timnit Gebru, who got pushed out by Google.
Well, this is a business, a money-making business. So if you're going to start speaking out against it, then obviously you're going to get removed.
That's a really good point, and I agree with you that we seem to have this bifurcation, where it is increasingly academic on the one side, but on the flip side, it's increasingly just being used by industry.
And it's just becoming part of what people do, right? This is where the AI ethics thing kind of grinds me a little bit, because AI ethics seems to be siding with exactly what you're saying, with that kind of weird academic tinge, right? So all of the academic research going on, which probably has almost no real-world application, and all of the AI ethics is sitting over there with the academics. But where is the AI ethics as it applies to business and the business use of AI?
So that's a really good question, because I don't actually think it exists, because...
Capitalism is the answer for all of these.
It is. Okay, so let's take an example. So there's a book called "Just Business: Business Ethics in Action" by Elaine Sternberg. And it's going back a bit now; it might be '97, '98 when this book came out. But within the first chapter, Sternberg lays out what business really is: business is to increase shareholder value over the long term by the selling of the business's products and services.
And Google is a business, you know; it's there to make money for its shareholders by whatever means, preferably legal, "don't be evil" and all that. And I think we'll get into this whole "don't be evil" thing, but it's like, at the end of the day, it's a business. You know, if you go back to, there we go, Tesco.
It's about behavior change with all this data. Can I make someone change their behavior and buy a different product? Is it ethical? Nope. Does it happen? You bet it does. Why does it happen? Increase the bottom line.
As Andrew... as Bolster just said, you know, I mean, ultimately it is the "yay capitalism" thing.
Yay. You know, the reason I was initially excited, when we had the first wave of AI ethics being a thing that people talked about, and I know I've said this publicly before, is that the whole concept of AI ethics introduces a discussion of ethics into discussions around business and technology, which honestly I'd never really been aware of before, you know what I mean? It was driving something that I thought could be hugely beneficial to how we look at ethical approaches to business, and how we look at ethical approaches to the application of technology generally, right? And I feel like it's kind of lost its way.
You know what I mean?
It has. I actually think COVID has a lot to do with it. Everyone was getting really excited, then COVID happened, and no one mentioned AI afterwards. You know, everyone went back to humanity. Everyone went outside and started walking around trees again, trying to get away from computers, you know?
They also ended up having their computer as a prime mover in the household, in a way that hadn't necessarily been the case for everybody before, where, you know, there hasn't been a meal for the past eight months that hasn't been within camera range of a Zoom call. You know, the collision between work life and home life.
I think, yeah, I would agree it's kind of de-emphasized the AI conversation, but ironically enough, it, how would you put it, brought it directly into people's homes and mixed the home and work environments in a way that hadn't been done before.
Yeah, totally. And, like, I was late starting.
Well, I... we... I live in a weird household, apparently, according to the rest of this nation. We normally eat dinner at, like, eight o'clock at night, so I had to rush upstairs to get dinner, because I had to come back down here for half seven. So I got off one Zoom call, ran upstairs, got some dinner, and have come down to another one.
But as you say, the computers and the interfaces and the constant usage of the technology have become even more pervasive than ever under COVID. But I agree with you that, if anything, during COVID, AI ethics has become this weird, sidelined academic thing where we're writing papers, but I'm not seeing it applied to business. Except for perhaps... well, maybe we should talk about this.
I mean, what are your thoughts on the whole thing with Timnit and Google?
Well, I haven't read the paper, but from the small amount I do know, yes, there's definitely an ethics argument there. But it all comes down to quality of data, you know, right at the start with all this kind of thing.
But like I said, I've not really... I've actually just got it in front of me on Technology Review. I've not read it, so I'm not going to go too deep into it. Yes, the way it was handled was really bad, because some people didn't even know it was happening. And then obviously it went through this whole Twitter explosion.
There's a shock. And, in all fairness, I'd never even heard of Timnit Gebru up until the 2nd of December.
I had never heard of her, and indeed I wasn't even aware of the Twitter chaos that was happening until a friend messaged me on WhatsApp going, "Are you following this?" And it's one of those weird ones where we're trying to be as objective as possible, researching on Google, and it does kind of come down to that thing. If we put aside any of the identity politics that might be associated with this, it comes back to the point that you made, Jase: this author has worked with other academic colleagues to say things that are not particularly singing the praises of her employer, and she no longer works for that employer. And it appears to be an example of "yay capitalism", right? So we have one of the largest companies on earth, who's committed to AI ethics, but when something is done that questions their ethics, they want to block that publication. Now, I haven't read the final publication.
It could be that their complaints are legitimate, but the optics on this are not good. And that's if I remove any of the identity politics around this; if we put that back in, then, Jesus, this is awful for Google.
Well, you've got to define what awful means for like one of the biggest companies on the planet.
It's not exactly like Google's variance from the original "don't be evil" slogan is news to anyone. I mean, I would argue that, to take it back to the "capitalism yay" argument, that reality is factored in: factored into the stock price, factored into the board allocations, factored into anything on that side of things.
I don't think this is a traumatic event in any way, because, if anything, the biggest challenge is that any paper like this automatically becomes a blueprint for what's acceptable in the industry. So if Google is doing something, then maybe Facebook or Uber or any of the rest of the FAANG, or whatever the FAANG gang is these days, you know, they go, "Oh, Google did that, that means I can do it too", or, "Oh, they decided not to do that, maybe we should dial that back". I mean, I would agree that the response hasn't been great, but, unfortunately, this is just the kind of thing that happens whenever you go against the party line.
But that's the question. If we say, Oh, it's awful for Google.
The question is, and again we take that pure capitalist perspective: does this affect their bottom line? Does this affect the share price? No. Does it affect public opinion? No, not really. It affects a certain sector of public opinion.
It's the same as the Ryanair effect: you can have positive or negative PR and it won't make much difference.
Well, the thing that I would think is that, you know, Google held themselves up as these bastions of, you know what I mean, "don't be evil". It was right there on their website, and they removed that and changed their slogan to "do the right thing". Now, you could be pro-Google and say "don't be evil" is a negative statement and "do the right thing" is a positive statement.
And therefore that's a positive PR spin. But you can also be very cynical and say "do the right thing" means make more and more money for Google, and don't rock the boat.
Is this what happens when you bring academia and business together? It's like oil and water. I know Bolster's laughing.
When you mix oil and water, it doesn't explode.
Well, the other thing is that if you do it properly, you get mayonnaise. So I don't think there's any... no.
But that's my point. It's more like a match in petrol when you add academia to business, a lot of the time, let's be honest.
I've got a tough relationship with this one, because I personally think that academia is a fantastic driver of innovation, and that in an ideal world, the pure... whenever I say academic, I don't just necessarily mean the operations within academic institutions, but the academic... what the fuck do we call it?
Knowledge economy. Not that I... I know, I know. The knowledge-economy buzzword bingo has always entertained me, because I think it's recently been sort of quasi-rebranded as the innovation economy or some bullshit, which at least implies that there's something that changes, you know.
Do we still have the second fastest growing knowledge economy in the UK?
We, yeah. Science park slash Catalyst slash Northern Ireland slash the province.
So is it Northern Ireland? Is it UK?
Europe? Right. Oh yeah, are we still part of Europe? Hang on, what date is it?
A few days.
A couple of days.
So I suppose the argument I'm trying to make is that whenever we talk about the innovation economy, the knowledge economy, whatever way you want to talk about it, at some point you have to create new things. And generally those new things don't immediately have an obvious use case. No, I'm not saying a business case.
I'm not saying a market plan. I'm saying they literally do not have a clear use case. It generally starts off as: okay, this is a cool thing, this is funky, or this solves some kind of abstract academic question. That area of the innovation economy, I think we could do with more of, because, especially locally, there's an awful lot of navel-gazing. And I am cycling it back to the AI ethics discussion.
There's an awful lot of navel-gazing and regurgitation of the same conversations, and I'm guilty of it as much as anyone else. You know, the talking circuit has effectively turned into a bingo card of going, right: when is Jase going to talk about Tesco, when's Bolster going to talk about education data sets being shit.
When's he going to... you know. I mean, also, we were doing the UU MSc AI workshop a couple of weeks ago, and I had to basically rebuild my talk three times over the course of the session, because everyone was talking about the stuff that I had in my sort of headline presentation.
I didn't, I didn't touch a thing of yours.
Oh yeah, because I knew what you were going to talk about too, so I didn't put it in mine.
That's the thing, though. So that's an interesting one, right? Both of you did lectures at one of our prestigious local universities, right?
You guys bring that industry perspective, which I feel, well, not sometimes, I feel consistently is lacking within academia, right? And I think it's important that people in academia get some kind of insight and industry perspective, but...
One of the things that kind of bugs me about it a little bit is that you're asked to come and do this for no reward, and it creates a huge amount of work for you. In fact, the last time I got invited, I literally declined, because it would have required me to spend at least a day or two preparing, drive 60 miles to deliver a one-hour lecture, and drive 60 miles home again.
And I was like, I'm really sorry, I really want to contribute, but honestly, you're asking too much. At that time I was working as a consultant, right, and I literally had to sit down and map out the cost to me of doing this in terms of not doing other business. And it was just unsustainable for me.
Right. So, I mean, one of the things that bugs me is the fact that you guys had to put so much time and effort into going in and supporting the universities, to give them the bit that they're missing in their prospectus.
I don't actually think it's that simple. And I'm not even saying it wasn't useful for the students.
There's an honest argument that... I mean, I'm very biased here, but I would much rather operate on a one-to-one or one-to-group conversational, classical seminar model than having presentations. Because the irony is, now that we're in this lovely COVID situation, everyone's doing remote meetups, everyone's doing remote lectures.
And I'm wondering, what's the fucking point, when everybody has access to YouTube? The meetup circuit has kind of fallen back to "oh, we just want to find somebody from somewhere in the world to talk about it" rather than using local people. And some of the groups have taken this brilliantly and gone: right, okay, we are the, you know, Belfast JS user group, whoever that is, and rather than focusing on the Belfast speaker circuit, they're bringing the international speaker circuit to a Belfast audience. That's fine. But I think it misses the fun discussion, which is what I got when I was a student. The most I got out of it was when I was able to talk to industry experts: actually being able to talk through a conversation about how I was approaching a problem, or get them to talk about their problems and then respond with how we would think about it. It has to be an actual conversation.
It can't just be a lecture.
Yeah. I mean, I think that's a really interesting thing, and this is a much more broad-ranging conversation than anything just to do with data and AI. For me, it raises that fundamental question of what is the place for further and higher education as we move forward.
Right. So, again, I went to University of Ulster, and I did my degree and PhD at University of Ulster. Had a great time, really enjoyed it, good education, no complaints. But if I were to take a step back and say: if I were 19 years old today, but somehow had the world experience that I have now, would I go to university, or would I do a bunch of Coursera courses taught by genuine world experts?
I would go with the latter, right? I'm going to get a much better education from MIT and Harvard and UCLA, and all of the folks from there who are genuine world experts. I can listen to, you know, the people who invented these technologies teaching them, as opposed to somebody who has read about them or has, you know, some vague awareness of them.
So the university...
hang on, that's my career!
But, you know what I mean? This is the thing: when it comes down to teaching, the further and higher education establishment's role now has to be more than just delivering content. It's not just stacking up a bunch of slides. It's not just about imparting knowledge, because the knowledge is already out there.
You know, it's no longer something locked in a textbook, right? There are YouTube videos, there's Coursera, Udemy; I could list 20 different educational platforms that will give you the same knowledge you can get from your lecture. But the point is that the universities have to adapt and change.
And they have to be pushed to that by doing this remotely, because they're not doing physical teaching anyway. So you then have to say: well, what is your value proposition? What is your USP? What is different between me paying to do a course at University of Ulster or Queen's, versus paying significantly less to do a course on Coursera or Udemy?
So there's a couple of things in there. The easy one that I'll head off first is that we've had this for a while, and it's called the Open University. The OU has been much maligned, in my opinion, even though it's a surprisingly solid institution. So that's the easy one out of the way.
For me, the knowledge isn't the important bit, right? Like, I still have to sit there and work out how to do the, sort of, current law with my hands, and my master's was electronics and electrical engineering. The knowledge, the rote memory, is functionally useless to me.
Especially, like, ten-plus years down the line, where, you know, my undergrad is... I don't think there's any knowledge left from that time in my head, I'll put it that way. However, the meta-knowledge, right? Knowing how to research, knowing how to ask questions, knowing how to present ideas, knowing how to evaluate options: that is what I do day in, day out. And I learned those skills from both my undergrad and the PhD. And I have never seen a MOOC model, the, so... massively online... what's the other O in MOOC?
Massive... I don't know, someone can Google that while I'm ranting. I've never seen a version of MOOCs, whether it's Coursera, whether it's Khan Academy, whatever, that satisfies that interpersonal meta-knowledge. And I personally think that the way we should be doing it, both as an industry and in terms of academia, is that we should actually take advantage of COVID and say: right, we've got the entire technology workforce working from home.
And they're probably going to stay there for at least six months to a year, probably more. So why don't we just do 30-minute match-ups between students and random technologists in their field? Just go: look, talk to this fucker for a while. It's not graded, it's not supposed to be a didactic exchange.
It's not supposed to be "I'm going to give you a presentation", because if the student wants to go and learn about topic X, as you say, the resources are out there. The problem is that those resources don't always have the capability of answering questions and following up with, "How does that apply in situation X?"
Or can you explain this to me? This is my background. Can you chop this into something that's digestible for me? And I think that, arguably is where the university should be thriving. And I don't know if they're doing anything or not behind the scenes because I'm not a student and no one's asked me to talk to any kids thats maybe for the best.
Yeah. But it's strange, because I had already sort of come up through where the universities were adopting the more "yay capitalism" approach, especially with the glut of international students coming in, the reduction of class sizes, the reduction of lecture sizes, and basically a drive away from high-end research to the sausage mill: the grind of get the students in, get the money, get them out, give them a piece of paper.
I don't think COVID has done that much to universities. I think the shift that the universities made from being centers of excellence to being paper mills is where things really got screwed.
Yeah, and in fairness, I agree. I think, if anything, all COVID has done is pose a position where the question needs to be asked more clearly, because the face-to-face versus online distinction has been removed, which makes the direct comparison more relevant. MOOC stands for massive open online course, which we obviously all knew.
Jason, I'm really interested, right? So, I mean, Bolster and I are vaguely recovering academics, having spent far, far, far too much of our lives in university. You represent someone who's more self-taught, right?
You're not an academic. Well, you've written the textbook, but, you know, you're not an academic. What are your thoughts on this?
I suppose it's worth going back to where I started, which was: I left school in 1988, when I was 16, and two weeks later I started at a computing college. But instead of it being five days a week doing the academic BTEC NVQ Level 3, which we were one of the first to do, we did it one day a week; the other four days a week we were on work placement.
So I ended up doing electronics as well, my first YTS job, and I was being paid 35 quid a week for doing this thing for two years.
That was good money, dude.
Yeah, bought a lot of records and, basically, pedals and things like that. Yeah. So, that one day a week, we were doing all the usual stuff.
And it branched into two directions: you were either doing some form of electronics engineering, which is what I was going to do, or programming. Now, all the programming guys were doing either COBOL or C, because, in the same way as the universities, you're feeding into business in the locality.
So if there's a demand for COBOL, then you teach COBOL, off you go. And if someone's using C for electronics, then boom, you go and learn C. That's how it was; it was that black and white. So, I took a bit of a break from it all from '91 to '95. That's when I worked in a record shop. I actually learned more about supply chains in the record shop than anything.
And I did a blog post about this years ago: the manager there taught me the art of prediction, but not on a computer. It was all about watching the customer, watching the buying patterns, analyzing them, and then making educated guesses, you know, and being allowed to be wrong at the same time.
You know? So my career has always been this weaving path of trying to figure out what's coming next. So, fast forward: I moved over in 2004, and for a few years there was nothing going on; I didn't know anybody. It was actually Twitter that saved me, because that's how I met Matt Johnson, and that's how I ended up in the room with INI.
And most of the startups around that table are still good friends now, you know? But that network element was missing for a good five years for me. I couldn't get work, I couldn't find anything. I found out very quickly that not having a degree at that time basically meant you were unemployable.
Yeah. Which was really hard for me, cause I ain't got a degree and I ain't going to get one. As much as I occasionally whine about it, I don't have one. I've stopped that now because someone said the thing, but honestly, yeah.
No, but the thing is, Jase, I mean, for me, you are literally the personification of the reason why a degree doesn't matter.
Right? You know what you're doing, you're incredibly successful, you honestly know this field better than most people I know, and better than the vast majority of people with degrees and many years of experience under their belts. You've written the textbook on machine learning. At this point you should hold up your book every week; have it sitting there so you can hold it up on cue.
And for me personally... but, hang on, Bolster, you speak, so we... just...
There we go, just for the product placement. But yeah...
For me, you are the personification of exactly why you don't have to follow that traditional course.
So I'm going to argue here, because you don't need to, but it doesn't not help.
So for instance, that statement makes a lot more sense if you've spent a bit of time academically studying Boolean algebra, so that many negations kind of helps. But I would agree that Jase is a poster child for, yes, from a practical standpoint, from literally an engineering standpoint, you can do an awful lot without any formal education, as a self-driven individual driven by curiosity, interest, and fixing problems as they come along.
And that is the guts of my career over the last 15 years.
Yeah. The danger, though, without blowing any more smoke up his ass, I mean, the danger of using someone like Jase as a poster child for, oh, we can all be AI engineers.
The problem is that the rarity isn't the AI engineering; the rarity is the self-driven curiosity and the experimental attitude. And ironically enough, that can be, maybe not taught, but it can be curated and it can be cultivated by communities like academic institutions. So while people like Jase, and I'd like to think me as well, have this anyway, and the university just provided an opportunity for me to take advantage of all of their toys rather than pay like 30 grand for a CNC myself, I do think there should be a place for local physical institutions to provide, and I hate to use this particular phrase, safe spaces for fucking around, and for growing and cultivating not just individuals but communities of individuals. And I'm also even more biased than usual, because even though those institutions exist, I disagreed with their implementation of that community of practice and then fucked off and made Farset Labs as a more customised and flexible community of practice. That wasn't intended to be a protest against the institution of academic universities themselves; it was more a critique that I wished the operations of something like Farset Labs were easier to pull off within those kinds of institutions. And for the record, having a 3D printer or a laser cutter in the corner of a room does not make a lab into a hackerspace. What makes a lab into a hackerspace is having a community of people who can play and experiment, and having coverage from some kind of management authority above them that says, if you fuck up, I have your back. That's usually what's missing.
No, no, look, I totally agree. And I think this is the thing, you know, academia's role has to become less didactic and pedagogical and more around exactly what you're saying. It's around building communities, building networks, building relationships, building environments where, as you say, people can experiment and fuck up and it's okay.
And they can learn from their peers as much as they learn from their academic seniors, you know, from the lecturers. And I think that's honestly one of the most key things about it: building that environment where you're learning from each other as much as you're learning from your teachers.
I think that's the bit that the universities are going to have to shift to, to understand that their role is more around building community than simply delivering content, because content delivery? We can do that. Coursera does it really, really well.
So there's another... this is a little bit of a drift towards the personal, but hopefully it'll make sense in the grand scheme of things. As I said before, it has been depressing to come up through my career during a phase where universities are getting further and further away from individualised, relationship-building education and drifting towards: I have a lecture hall with 250 people in it, and I'm going to give you a 90 minute lecture, and you can maybe visit my office between these 40 minute windows twice on Tuesdays. That has been sad, because for most of my academic career, or at least the early part of it, I simultaneously got away with and learned more from individual relationships with individual lecturers than through almost any of the lectures. The actual ability to talk with an expert in their field, whether they're a professor or a reader or whatever, it doesn't really matter, but basically to talk to a grown-up. Because we've got to remember that we're talking about 18 year old kids coming in who have unfortunately been brought up on whackos like us doing remote school visits and saying, oh, the world is your oyster and everything's going to be fantastic.
They just expect to be handed, oh, by the way, you're now a startup billionaire, well done. But unfortunately there are a few steps in between, and that can be quite challenging for people. I don't.... The idea that you can scale that and just say, right, we've upped our class sizes from 20 to 40 and there's going to be no deficiency in education quality? That's bullshit.
Realistically, if I was in charge of education, if I was dictating education policy, I would throw in a little bit of a strange one. And I do mean dictate as in nobody can question it: we shouldn't necessarily focus on getting everybody up to the same wonderful level all the time; we should be creating environments that curate and cultivate maybe a top 2, 3, 4% of people who are then able