Alright, so last, but by no means least, is machine learning. Machine learning is the overarching term that covers basically all the other buzzwords you hear out there: deep learning, neural nets, AI, all those sorts of things get grouped together under that umbrella. Now, machine learning has actually been around for a number of decades. There was what they call an AI winter over the 80s and 90s, where there was a huge amount of hype about it, but it didn't really play out, and most people basically lost interest in it and ignored it for a number of years. The reason it's become such a huge thing recently is not because there have been huge, great advancements in machine learning itself; a lot of what we do with things like image classification was actually invented quite a while ago. What is very different now is the level of computational power that's available.
Take PC graphics cards, which were primarily built to play games for the PC gaming market, things like Doom, Quake and Grand Theft Auto. Those same graphics cards can actually be used to run machine learning, deep neural networks and all that sort of stuff. So it's partly because of gaming, and the millions and millions of people buying graphics cards and effectively paying for the research and development of faster and faster graphics units, that modern machine learning has become possible. Along with those graphics cards, which are much cheaper, much more powerful and have much more memory than they used to, there's also the other side of it, which is big data.
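To make that concrete, here is a minimal sketch of what "using a graphics card for machine learning" looks like in practice, assuming the PyTorch library; the tiny model and the random stand-in images are hypothetical placeholders, not anything from this course.

```python
# Minimal sketch: one training step of a tiny image classifier on a GPU (assumes PyTorch).
# The model and the random data below are placeholders, purely for illustration.
import torch
import torch.nn as nn

# Use the graphics card if one is available, otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# A very small convolutional classifier (e.g. "cat" vs "not cat").
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(16, 2),
).to(device)  # move the model's weights onto the GPU

optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# A random batch standing in for real training images and labels.
images = torch.randn(32, 3, 64, 64, device=device)
labels = torch.randint(0, 2, (32,), device=device)

optimizer.zero_grad()
loss = loss_fn(model(images), labels)
loss.backward()   # gradients are computed on the GPU
optimizer.step()
print(f"loss: {loss.item():.4f}")
```

The point is simply that the same hardware built for rendering games does the heavy number-crunching here; swapping `device` between CPU and GPU is often the only code change needed.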
Now, big data came about around the 2000s and 2010s, when we started generating huge amounts of data, and it's really important for machine learning. All else being equal, with the same programming and the same base system, if you have 10,000 images of a cat versus a billion images of a cat, you can train that same system much, much better with more images and bigger data. So the data collection that companies like Google and Facebook have been able to do with their huge databases has made their neural nets and machine learning systems that much better, because they have that data to train the systems on. It's just like anything we humans do: if I try the long jump and have 10 goes at it, sure, I might be able to do it a little bit, but compare that to me trying it after 10,000 goes; I'm obviously going to be a lot more experienced and get much better results.
And it's the same thing for machine learning. Now, when it comes to AI or machine learning, a lot of people think it's just for niche cases, like maybe the assistant on an iPhone: a useful case, sure, but not particularly far-reaching or anything like that. That's actually very wrong. A lot of industry leaders have speculated that AI is essentially like electricity: initially, when electricity came out, sure, it had its uses, but over the next many, many decades it was rolled out and made use of by every single industry, from light bulbs to factories to cameras to computers, and it basically improved every single industry enormously. They see AI and machine learning as a second coming of electricity, if you will. Sure, you'll still have your cars, they'll still be driving around, but they will have AI on them that dramatically improves their efficiency and gives them new abilities; for cars that might be something like self-driving or fully autonomous cars. We're also seeing it with cameras, where they intelligently check the actual scene you're taking a photo of and determine what the best settings for the camera are. All these sorts of improvements and efficiency gains, for every different type of equipment and industry.
So they're expecting AI to roll out over the coming years across every single industry, regardless of what it is and what it does. Now, to give another real-world example of AI and machine learning rolling out to existing industries, the example I'd like to use is Google. They have a lot of data centers out there, as you can imagine, for their servers and cloud computing. With that many computers in a data center, it gets very hot, so they need to cool it and make sure the temperature doesn't get too high. That takes a lot of energy and a lot of technical work to balance it and make it as efficient as possible, so you're not just pumping huge amounts of AC into a building with the doors left open or something like that.
It's got to be very, very efficient, and they want to save as much money as they can. Obviously there have been leading experts in this field developing this technology and these systems for many, many years; computer data centers are not a new thing, they've been around for decades. So this is a well-established industry, and you would expect Google to be at the absolute forefront of it, given the huge number of data centers they run and the complexity involved.
Now, that being said, Google recently developed an AI and put the cooling and the management systems for the cooling of one of its data centers in the charge of that AI. They handed over control to the AI system and said, here you go, here are all the controls for the air conditioning and the windows or whatever it was; you figure it out, you learn, you optimize as best you can. The results were actually quite staggering: there was a 40% drop in how much energy the AI required to effectively cool and maintain that data center compared to what Google was using before. That is a huge improvement.
And I'm not talking about Google refitting their data center with more efficient air conditioners or installing some fancy new technology. Hardware-wise, it was using the exact same hardware and cooling systems that were there before; it was just a software change. So these are the types of improvements that can be seen even in industries as mundane as air conditioning and cooling. It's not super exciting, but if you can shave 40% off your electricity bill, hell, even at home that's a huge win. And if you're a company, 40% off something like your cooling bill for a data center is an enormous amount of money.
So these are the sorts of performance improvements and gains that AI can deliver, and it's not just that. They can now beat humans at poker, they can beat humans at Go, they can beat humans at chess and Jeopardy, and there are more and more of these things that AI and machine learning are getting better at than even the best humans in the world. These are the current-day abilities of AI; this isn't talking about the future or anything like that, this is what is actually happening today in artificial intelligence and machine learning. That's why I'm quite bullish on where this technology is going and how much potential it has. Currently, a lot of the positions are in programming: developing these machine learning algorithms, improving them and rolling them out into systems.
People are frantically trying to do this as fast as they can. When you've got huge companies making savings like 40% off their electricity bill, any company is going to be scrambling to try and match or better that. And it's not just from the pure perspective of saving money; it's business-level stuff. If your competitor is doing something 40% cheaper than you, then you're at a significant disadvantage to them. So there's a lot of competition going on: if one company rolls out a fantastic new AI or machine learning feature, their competitor has to as well, otherwise they will just pale in comparison. This hiring of computer programmers and artificial intelligence programmers, people that live and breathe machine learning, has almost gotten to the point of celebrity status, where the people that are absolute leaders in their field can command basically whatever wage they want.
So it's a very, very important and rapidly developing field, and it should just get even bigger in the future. Now, about this demand for machine learning programmers: there's obviously huge demand, with companies paying these high wages, which means there's basically not enough supply of machine learning programmers. So there's a big opportunity right now, and for many years into the future I would expect, for anyone who's interested in it. Maybe you're a programmer already and your job's not really going anywhere. If you start learning machine learning programming, you open up a huge new category with much higher wages and much more demand. It's a big shift for you, but you're still using your core skill of computer science or programming.
Maybe you're just interested in the whole field itself; maybe you aren't a programmer, but you can still start doing courses online about it. There are many, many courses out there, both paid and free, that will teach you about machine learning. There's even my own course on how to build your own deep learning PC, so you can check that out as well. But that is pretty much it: the whole course, and the bonus content as well. So we're pretty much done.
If you want even more, head to my website, AlexSchulman.com. There are lots and lots of free guides there to tell you even more about upcoming technologies and to keep track of the industries I mentioned before: things like drones, machine learning, AI, nanotechnology, all that sort of stuff. There's lots of information there that's continuously getting updated, so feel free to have a look. I hope you've enjoyed all the content, and I'll see you next time in one of my other courses.
Thanks.