Welcome back. Today we're going to have a look at some of the source code for the AVL tree. The link to the source code I'm going to present in this video can be found in my data-structures repository on GitHub. Make sure you have watched the last three videos on the AVL tree, about tree rotations, insertions and removals in AVL trees, before continuing, so you can understand the source code I'm presenting. I don't normally do this, but I'm going to present a live demo of the AVL tree in action. So I'm here in my GitHub repository.
I'm just compiling some of the Java code for the AVL tree, and then I'm going to run it, and you see that it generated a random tree with some values. Notice that the tree is relatively well balanced given the number of nodes that are in it. So I randomly inserted some nodes, and you might expect the tree to be a bit more sloppy if it were just a simple binary search tree, but the AVL tree really keeps the tree quite rigid. So I think we're ready to dive into the source code now. Here we are in the source code of a recursive AVL tree implementation in the Java programming language.
So let's get started. If you look at the class definition for the AVL tree, you will notice that this class takes a generic type argument of type T, which extends Comparable. This generic type I'm defining basically says that the values we're going to be inserting inside the tree need to be comparable, in a sense, meaning we need to be able to insert them and know how to order them. So if you look inside the subclass Node, which I've created, you can see that the value we're actually storing inside the node is of type T, so it's comparable, and that's very important. The other instance variables I'm storing inside the Node subclass of the AVL tree are, of course, the balance factor as an integer and the height of the node, because we want to be able to query the height of a node in constant time, so I'm storing it inside every node.
And then of course there are the left and right child pointers right here. Notice that this Node class implements the TreePrinter's PrintableNode interface, and that's just an interface I have somewhere in this folder that allows me to display the tree in the terminal. So this isn't a requirement if you're actually building an AVL tree, and nor are these overrides; those are also just for being able to display the tree in the terminal, which is really handy for debugging, actually. All right, so for instance variables at the AVL tree class level, we have the root node. This should really be private, although I'm using it for testing, so I'm leaving it package private.
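Putting those pieces together, the Node subclass I've been describing looks roughly like this. This is a sketch under my own naming assumptions (`bf` for the balance factor, a cached `height` field), not necessarily the repository's exact code, and it leaves out the tree-printing interface:

```java
public class NodeSketch {
    static class Node<T extends Comparable<T>> {
        T value;             // the comparable value stored in this node
        int bf;              // balance factor of the subtree rooted here
        int height;          // cached height, so height queries are O(1)
        Node<T> left, right; // child pointers

        Node(T value) { this.value = value; }
    }

    public static void main(String[] args) {
        Node<Integer> root = new Node<>(5);
        root.left = new Node<>(3);
        root.right = new Node<>(8);
        System.out.println(root.left.value + " " + root.value + " " + root.right.value);
    }
}
```

Caching the height in every node is what makes the constant-time height query possible; the trade-off is that every insert and remove has to keep it up to date.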
I'm also keeping track of the number of nodes inside the tree in this nodeCount variable. Something else I'm also using for testing is this height function, although it's really good to just sanity check yourself and make sure your AVL tree's height is relatively low. Then there are these methods like size and isEmpty, which are pretty self-explanatory. Then there's the display method I call to actually print the tree in the terminal, and the contains method to check if a certain value already exists inside the tree. So this is the public-facing method, which calls the private method, and I give it the root as the initial node to start off with.
So, to check if a node exists inside the tree: if we hit the base case, then we know that the value does not exist. Otherwise, we compare the current value to the value inside the node. So compareTo returns either a value less than zero, which means that if the value does exist it'll be found in the left subtree, or a value greater than zero, meaning that if the value exists it's in the right subtree. Otherwise, the comparator value is equal to zero, and that means we found the node inside the tree, so we can return true. Next, the insert method: we're trying to insert this value variable, and if it's null, we don't want anything to do with it.
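The recursive contains logic just described can be sketched like this. To keep it minimal and self-contained I've used a pared-down int-valued Node rather than the generic one, so `Integer.compare` plays the role of `compareTo`; the names are my assumptions, not the repository's exact code:

```java
public class ContainsSketch {
    static class Node {
        int value;
        Node left, right;
        Node(int value) { this.value = value; }
    }

    static boolean contains(Node node, int value) {
        if (node == null) return false;               // base case: not in the tree
        int cmp = Integer.compare(value, node.value); // stands in for compareTo
        if (cmp < 0) return contains(node.left, value);  // search left subtree
        if (cmp > 0) return contains(node.right, value); // search right subtree
        return true;                                  // cmp == 0: found the value
    }

    public static void main(String[] args) {
        Node root = new Node(5);
        root.left = new Node(3);
        root.right = new Node(8);
        System.out.println(contains(root, 8)); // true
        System.out.println(contains(root, 4)); // false
    }
}
```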
So we just return false. And we return a boolean from the insert method to indicate whether an insertion was successful or not. So if the value does not already exist inside the tree, then we're going to call the private insert method, assign the result to the root, and also increment the node count, because we know we're going to successfully insert something if we're inside this block; at the same time, we return true. All right, let's have a look at the private insert method. So if we reach the base case, meaning we traversed all the way down the tree and the value of node is null, then we know this is the position where we need to insert the new node.
So create a new node and give it the value. Otherwise, we're searching for the insertion position. So again, invoke the comparator: do value.compareTo(node.value). This compareTo comes from the Comparable interface at the class level; that's why we're allowed to invoke compareTo on a generic type here and another generic type there. So again, if the comparator is less than zero, then insert in the left subtree.
Otherwise, insert in the right subtree. And here are the extra two lines you need to add for the AVL tree: calling the update method to update the balance factor and the height value for this node, and also rebalancing the tree if necessary; well, rebalancing the tree at this node. All right, so let's look at the update method. So this is to update the balance factor and height of a node. Here are two variables that grab the height of the left subtree and the right subtree.
Then I update the height for this node, which is always one plus the maximum of the left and right subtree heights. Once we have the appropriate values for left and right, we can of course update the balance factor, which is the difference between the right and the left subtree heights. Okay, so we've done the update; now we can call the balance method. Inside the balance method, we're looking to readjust nodes whose balance factors are either minus two or plus two. In the case of minus two, this means that our tree is left heavy. And inside the left heavy case, we can run into the left-left case or the left-right case.
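The update step just described can be sketched on its own. A minimal version, assuming a null child counts as height minus one so a leaf ends up with height zero (a common convention, though the repository may differ):

```java
public class UpdateSketch {
    static class Node {
        int value, bf, height;
        Node left, right;
        Node(int v) { value = v; }
    }

    // Height of a subtree; null children count as -1 so a leaf has height 0.
    static int height(Node n) { return n == null ? -1 : n.height; }

    static void update(Node node) {
        int left = height(node.left);
        int right = height(node.right);
        node.height = 1 + Math.max(left, right); // one plus the taller child
        node.bf = right - left;                  // balance factor
    }

    public static void main(String[] args) {
        Node root = new Node(5);
        root.right = new Node(8);
        update(root.right); // leaf: height 0, bf 0
        update(root);       // one right child: height 1, bf +1
        System.out.println(root.height + " " + root.bf); // 1 1
    }
}
```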
And inside the right heavy case, we have the right-right case and the right-left case, and to identify those sub-cases you just check the balance factor on either node.left or node.right respectively. If the balance factor of the node is not minus two or plus two, then we know that the balance factor is either zero, plus one or minus one, which is fine, and we don't have to touch anything. We don't have to rebalance the tree if any one of those is true, so we just simply return the node. So now we can look at the individual cases themselves. The left-left case will just perform a right rotation.
The left-right case will do an initial left rotation and then call the left-left case. So we're reusing the method we already wrote here, since this case degrades to that case after one rotation. A similar thing happens with the right-right case and the right-left case: we get to call the right-right case after one right rotation over here. Then here are the notorious left rotation and right rotation methods. For the AVL tree, it's very important that we call these two update methods to update the height and balance factor values after we've rotated about this node, because those values will undoubtedly change. It's also important that this is the order you do them in; you can't do them in the inverse order, because you have to update the child node first to get the correct balance factor and height values for the parent.
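Here is a runnable mini-sketch of the rotation and balance logic just described, wired into a minimal int-valued insert so it can actually be exercised. The names and structure are my assumptions; the repository code is generic and split into more methods, but the four cases and the child-before-parent update order are the same:

```java
public class BalanceSketch {
    static class Node {
        int value, bf, height;
        Node left, right;
        Node(int v) { value = v; }
    }

    static int height(Node n) { return n == null ? -1 : n.height; }

    // Recompute the cached height and balance factor from the children.
    static void update(Node n) {
        int l = height(n.left), r = height(n.right);
        n.height = 1 + Math.max(l, r);
        n.bf = r - l;
    }

    static Node leftRotation(Node node) {
        Node parent = node.right;
        node.right = parent.left;
        parent.left = node;
        update(node);   // child first, so the parent sees correct heights
        update(parent);
        return parent;
    }

    static Node rightRotation(Node node) {
        Node parent = node.left;
        node.left = parent.right;
        parent.right = node;
        update(node);
        update(parent);
        return parent;
    }

    static Node balance(Node node) {
        if (node.bf == -2) {                      // left heavy
            if (node.left.bf > 0)                 // left-right case degrades to
                node.left = leftRotation(node.left); // left-left after one rotation
            return rightRotation(node);           // left-left case
        }
        if (node.bf == +2) {                      // right heavy
            if (node.right.bf < 0)                // right-left case degrades to
                node.right = rightRotation(node.right); // right-right
            return leftRotation(node);            // right-right case
        }
        return node; // bf is -1, 0, or +1: nothing to fix
    }

    static Node insert(Node node, int value) {
        if (node == null) return new Node(value);
        if (value < node.value) node.left = insert(node.left, value);
        else node.right = insert(node.right, value);
        update(node);          // the two extra AVL lines on the way back up
        return balance(node);
    }

    public static void main(String[] args) {
        Node root = null;
        for (int v = 1; v <= 7; v++) root = insert(root, v);
        // Inserting 1..7 in order gives a height-6 chain in a plain BST;
        // the rotations keep this tree at height 2 with 4 at the root.
        System.out.println(root.value + " " + root.height); // 4 2
    }
}
```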
Okay, now we're at the remove method. In this remove method, we want to remove this element, or I should call it a value rather, but still. So if the element is null, well, we know it doesn't exist in the tree, so just return false. Otherwise, make sure the value exists in the tree and then remove it by calling the private remove method, then decrease the node count and simply return true from the remove method to indicate that the removal was successful. So let's look at this private remove method, where all the action is happening. For the base case, just return null. Then we get our comparator value and do what we always do: check if the comparator is less than zero.
If so, we dig into the left subtree, because we know that the value we're looking for is smaller than the current value. Then we recursively call the remove method, passing down node.left, so we're just shrinking the region we're searching, and passing the element as well. Otherwise, do the same thing but for the right, so recurse down the right subtree. Otherwise, if this is not true and that is not true, then we know that the comparator value is equal to zero, meaning we found the node we wanted to remove. And we know from the slides that this is where the four cases come up: if node.left is equal to null, then we know that we want to return node.right.
Or if node.right is null, meaning the right subtree is null, then return node.left. Okay, and then the final tricky case, where we're trying to remove a node that still has both subtrees: the left subtree is there and the right subtree is there, but we still want to remove that node. The successor can either be the smallest value in the right subtree or the largest value in the left subtree, and I'm using a heuristic right here to actually determine which one I want to pick, just because I think it's a good heuristic. Here's what it is: if the height of the left subtree is greater, I want to remove nodes from that subtree.
But otherwise, if the right subtree has a larger height, then I want to remove nodes from that subtree. So that's the heuristic I'm using to choose which subtree I remove from. I just made up this heuristic, but I think in general it'll work pretty well. So here's what we do: let's say the left subtree has a larger height. Then I'm going to find the successor value by going down the left subtree once and then digging all the way to the right to find the max value, and then swap that value into the current node. And then, this is the recursive part, we recurse.
We remove the successor value instead of the element we passed in; or rather, the element we passed in becomes the successor value. So we've removed the element, but now we also have to remove the successor value, because we don't want duplicate values inside our tree. Then we basically do the same thing but on the other side if the condition goes the other way. Here's what I was mentioning in the last video, where you want to call the update and rebalance methods on the callback of this remove method, so that the tree remains balanced even though we're removing nodes. And these are the findMin and findMax methods I was calling right up here, which just dig left or dig right depending on the case. Here's just an iterator to traverse the tree, I don't think I need to go over this, and here's another method for testing which validates the binary search tree invariant, also not particularly important.
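The remove logic with the subtree-height heuristic can be sketched like this. It's a hedged, int-valued sketch under my own naming assumptions (the repository's generic code differs in the details), with compact copies of the update/balance machinery so the rebalancing on the callback actually runs:

```java
public class RemoveSketch {
    static class Node {
        int value, bf, height;
        Node left, right;
        Node(int v) { value = v; }
    }

    static int height(Node n) { return n == null ? -1 : n.height; }

    static void update(Node n) {
        int l = height(n.left), r = height(n.right);
        n.height = 1 + Math.max(l, r);
        n.bf = r - l;
    }

    static Node leftRotation(Node node) {
        Node parent = node.right;
        node.right = parent.left;
        parent.left = node;
        update(node); update(parent); // child first, then parent
        return parent;
    }

    static Node rightRotation(Node node) {
        Node parent = node.left;
        node.left = parent.right;
        parent.right = node;
        update(node); update(parent);
        return parent;
    }

    static Node balance(Node node) {
        if (node.bf == -2) {   // left heavy
            if (node.left.bf > 0) node.left = leftRotation(node.left);
            return rightRotation(node);
        }
        if (node.bf == +2) {   // right heavy
            if (node.right.bf < 0) node.right = rightRotation(node.right);
            return leftRotation(node);
        }
        return node;
    }

    static Node insert(Node node, int value) {
        if (node == null) return new Node(value);
        if (value < node.value) node.left = insert(node.left, value);
        else node.right = insert(node.right, value);
        update(node);
        return balance(node);
    }

    // Dig left for the smallest value, right for the largest.
    static int findMin(Node n) { while (n.left != null) n = n.left; return n.value; }
    static int findMax(Node n) { while (n.right != null) n = n.right; return n.value; }

    static Node remove(Node node, int value) {
        if (node == null) return null;                    // base case
        if (value < node.value) {
            node.left = remove(node.left, value);         // shrink search region left
        } else if (value > node.value) {
            node.right = remove(node.right, value);       // shrink search region right
        } else {
            // One (or no) child: splice in the other subtree.
            if (node.left == null) return node.right;
            if (node.right == null) return node.left;
            // Two children: take the successor from the taller subtree
            // (the heuristic from the video), swap it in, then remove it.
            if (height(node.left) > height(node.right)) {
                int successor = findMax(node.left);
                node.value = successor;
                node.left = remove(node.left, successor);
            } else {
                int successor = findMin(node.right);
                node.value = successor;
                node.right = remove(node.right, successor);
            }
        }
        update(node);          // on the callback: fix height and bf...
        return balance(node);  // ...and rebalance at this node
    }

    public static void main(String[] args) {
        Node root = null;
        for (int v = 1; v <= 7; v++) root = insert(root, v);
        root = remove(root, 4); // remove the root's value: successor 5 swaps in
        System.out.println(root.value + " " + root.height); // 5 2
    }
}
```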
And this is what I was calling in the main function, which will just randomly insert values inside the tree and then display it. So when I invoked this file on the terminal, this is what executed. So that is an AVL tree. It's a pretty cool data structure, and I quite enjoyed writing it up. So guys, thank you for watching. If you learned something, please like this video and subscribe, and I will catch you next time.