Priority queue source code


Transcript

All right, welcome back, everybody. This is part five of five in the priority queue series, and we're going to have a look at some source code for the priority queue today. If you want the source code, there's a link to the GitHub repository, which has all the data structures in this series, the priority queue being one of them. Also, make sure you've watched parts one through four so you can actually understand what's going on.

Alright, let's dive into the code. Here we are inside the source code. Notice that the types of elements I'm allowing inside my priority queue have to be comparable elements, as we talked about. If they implement the Comparable interface, then we allow them inside our queue. That covers things like strings, integers, anything with a Comparable interface. Now let's have a look at some of the instance variables.

First, I have the heap size, which is the number of elements currently inside the heap. I also have another instance variable, the heap capacity, which is the size of the list we have for our elements; that may be larger than the heap size. Then there's the heap itself, which we maintain as a dynamic list of elements using Java's List. Finally, to support O(log n) removals, I also keep track of a map.

The map takes an element to a tree set of integers: all the positions inside the heap where that element T can be found. Next, I've got a few different constructors for the priority queue. You can just create a priority queue, and by default I create an initially empty priority queue with a capacity of one. But I also allow you to create a priority queue with a defined initial capacity, which is actually more useful, because then we don't have to keep expanding our dynamic list every time; I'd recommend that one. Even better, if you already know all the elements that are going to be inside your heap, you can construct the priority queue in linear time using an operation called heapify, which I didn't talk about in the slides.
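As a rough sketch of what was just described (the class and field names here are my own, not necessarily the exact ones from the repository), the Comparable bound, the instance variables, and the two basic constructors might look like this:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.TreeSet;

// Only comparable elements (String, Integer, ...) are allowed in the queue.
public class MinPQ<T extends Comparable<T>> {
    private int heapSize;     // number of elements currently in the heap
    private int heapCapacity; // size of the backing list; may exceed heapSize
    private List<T> heap;     // the heap itself, kept as a dynamic list

    // element -> every heap index where it occurs, enabling O(log n) removals
    private Map<T, TreeSet<Integer>> map = new HashMap<>();

    public MinPQ() {
        this(1); // default: an initially empty queue with capacity one
    }

    public MinPQ(int initialCapacity) {
        heap = new ArrayList<>(initialCapacity); // preallocate to avoid regrowth
        heapCapacity = initialCapacity;
    }

    public int size() {
        return heapSize;
    }

    public boolean isEmpty() {
        return heapSize == 0;
    }
}
```

Preallocating via the second constructor is what saves the repeated dynamic-list expansions mentioned above.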

But it can be very, very useful. This constructor has all the usual setup stuff; here I'm adding all the elements to the map, but also to the heap. And then here's the heapify process: we start about halfway through the heap size, work our way down toward the root, and sink each element. And you're thinking, wait a second, isn't sink a logarithmic operation?

Well, yes, it is in the general case, but not if we're doing it in this fashion. I've left a reference to a paper up here, because it isn't quite intuitive why the heapify operation has this linear complexity. If you look at the analysis in the paper, you end up seeing that the complexity boils down to a convergent series, and that's why we can confidently say it's linear time. In general, though, if you're given a collection of elements, you might be tempted to just initialize an empty heap and use the add method to insert the elements one at a time.

That will give you an O(n log n) bound, so definitely use heapify whenever possible. Okay, now for some pretty simple methods. isEmpty just returns true or false depending on whether the heap is empty. Then there's clear: when we clear the heap, we remove all the elements inside our heap array, but also inside our map, which is why it calls map.clear(). And size just returns the heap size, pretty simple.
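Going back to heapify for a moment, the linear-time pass can be sketched like this, here on a plain list of integers for brevity (class and method names are illustrative, not the author's exact code):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public class HeapifyDemo {

    // Build a min-heap in place in O(n): start at the last parent node
    // (index size/2 - 1) and sink each node down toward the leaves.
    public static void heapify(List<Integer> heap) {
        for (int i = Math.max(0, heap.size() / 2 - 1); i >= 0; i--) {
            sink(heap, i);
        }
    }

    // Standard top-down sink with zero-based indexing.
    public static void sink(List<Integer> heap, int k) {
        int n = heap.size();
        while (true) {
            int left = 2 * k + 1;
            int right = 2 * k + 2;
            if (left >= n) break;                 // k is a leaf
            int smallest = left;
            if (right < n && heap.get(right) < heap.get(left)) smallest = right;
            if (heap.get(k) <= heap.get(smallest)) break; // heap order holds
            Collections.swap(heap, k, smallest);
            k = smallest;
        }
    }

    public static void main(String[] args) {
        List<Integer> h = new ArrayList<>(Arrays.asList(5, 3, 8, 1, 9, 2));
        heapify(h);
        System.out.println(h.get(0)); // the minimum, 1, is now at the root
    }
}
```

The intuition behind the convergent series: most nodes sit near the bottom of the tree and sink only a few levels, so the total work sums to O(n) even though a single sink can cost O(log n).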

Peek, a really useful method, just looks at the top of our priority queue: if the queue is empty it returns null, otherwise we do a lookup at the first index inside our heap and return that element, because it's the root node. Poll is similar to peek, except that we also remove the very first element, and we return it because you want that information.
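A sketch of peek and poll, again on a plain integer list with names of my own choosing:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public class PeekPollDemo {

    // peek: return the root (index 0) without removing it; null if empty.
    public static Integer peek(List<Integer> heap) {
        return heap.isEmpty() ? null : heap.get(0);
    }

    // poll: remove and return the root. Swap it with the last element,
    // drop the last slot, then sink the new root back into place.
    public static Integer poll(List<Integer> heap) {
        if (heap.isEmpty()) return null;
        Integer root = heap.get(0);
        Collections.swap(heap, 0, heap.size() - 1);
        heap.remove(heap.size() - 1);
        sink(heap, 0);
        return root;
    }

    private static void sink(List<Integer> heap, int k) {
        int n = heap.size();
        while (true) {
            int left = 2 * k + 1, right = 2 * k + 2;
            if (left >= n) break;
            int smallest = left;
            if (right < n && heap.get(right) < heap.get(left)) smallest = right;
            if (heap.get(k) <= heap.get(smallest)) break;
            Collections.swap(heap, k, smallest);
            k = smallest;
        }
    }

    public static void main(String[] args) {
        List<Integer> h = new ArrayList<>(Arrays.asList(1, 3, 2, 5));
        System.out.println(poll(h)); // 1
        System.out.println(peek(h)); // 2
    }
}
```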

Next, the contains method. Because we have a map with all our elements, we can do a map lookup to see if an element is inside the heap. This reduces the complexity from linear, where we'd have to do a linear scan through all the elements to check containment, down to constant time, which is remarkable. In the general case, though, people don't usually maintain this map; I just wanted to show you that it's possible. The map does add a lot of constant overhead, which you may or may not want. I personally find that the overhead is quite a lot.
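The constant-time containment check might look like this minimal sketch (the `track` helper is hypothetical, standing in for the real map bookkeeping):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.TreeSet;

public class ContainsDemo {

    // element -> all heap indices where it currently occurs
    private final Map<Integer, TreeSet<Integer>> map = new HashMap<>();

    // Hypothetical helper: record that `value` lives at heap position `index`.
    public void track(Integer value, int index) {
        map.computeIfAbsent(value, v -> new TreeSet<>()).add(index);
    }

    // O(1) expected hash lookup, versus an O(n) scan over the heap list.
    public boolean contains(Integer elem) {
        return elem != null && map.containsKey(elem);
    }

    public static void main(String[] args) {
        ContainsDemo pq = new ContainsDemo();
        pq.track(7, 0);
        pq.track(9, 1);
        System.out.println(pq.contains(7)); // true
        System.out.println(pq.contains(4)); // false
    }
}
```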

And I usually don't remove things very frequently from my heaps, so it might not be entirely worth it. But it's up to you: if you're doing as many removals as additions, it's definitely worth it. All right, now let's have a look at the add method. This method adds an element to the priority queue, and that element cannot be null.

First we check whether our heap size is less than the capacity; otherwise, we have to expand the capacity of the heap. Next, we make sure we add the element to the map so we keep track of it, and then we swim it up. Remember that we have to swim the node up because it gets added at the very end of our list, so we have to adjust where it goes inside our heap by swimming upwards.
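On a plain list (which grows itself, so the capacity check disappears, and with the map bookkeeping omitted), the add method can be sketched as:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class AddDemo {

    // Append at the very end, then swim the new element up into place.
    public static void add(List<Integer> heap, Integer elem) {
        if (elem == null) throw new IllegalArgumentException("no null elements");
        heap.add(elem);
        swim(heap, heap.size() - 1);
    }

    private static void swim(List<Integer> heap, int k) {
        int parent = (k - 1) / 2;
        while (k > 0 && heap.get(k) < heap.get(parent)) {
            Collections.swap(heap, parent, k);
            k = parent;
            parent = (k - 1) / 2;
        }
    }

    public static void main(String[] args) {
        List<Integer> h = new ArrayList<>();
        for (int x : new int[] {5, 3, 8, 1}) add(h, x);
        System.out.println(h.get(0)); // 1, the smallest added so far
    }
}
```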

This next method, called less, is a helper method that checks whether node i is less than or equal to node j. It uses the fact that both elements are comparable, so we can invoke the compareTo method; if we go back up, that comes from the Comparable interface we required. So less returns true if node i is less than or equal to node j.
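Such a helper might be as small as this (a sketch, assuming the same Comparable bound):

```java
import java.util.Arrays;
import java.util.List;

public class LessDemo {

    // true if the element at index i is <= the element at index j,
    // relying on the Comparable bound to call compareTo.
    public static <T extends Comparable<T>> boolean less(List<T> heap, int i, int j) {
        return heap.get(i).compareTo(heap.get(j)) <= 0;
    }

    public static void main(String[] args) {
        List<Integer> h = Arrays.asList(2, 7);
        System.out.println(less(h, 0, 1)); // true
        System.out.println(less(h, 1, 0)); // false
    }
}
```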

Awesome. Next is the bottom-up node swim, where we try to swim node k. First we grab the parent of node k, which you can get by solving for the parent index. Remember that I'm starting at index zero; some people like to start their heaps at index one, but I like everything zero-based. So the parent is at position (k - 1) / 2, because we're going upwards.

While k is still greater than zero, meaning we're not at the root, and node k is less than its parent, we want to swim upwards. So we swap nodes parent and k, and k becomes the old parent's position. Then we get the parent of k once more, and we keep doing this, going up the heap and swapping nodes. That's how we do the swim. The sink is similar, but a little different.
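The swim loop just described can be sketched like this (zero-based, on a plain integer list):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public class SwimDemo {

    // Bottom-up swim of node k; zero-based, so the parent of k is (k - 1) / 2.
    public static void swim(List<Integer> heap, int k) {
        int parent = (k - 1) / 2;
        // While we're not at the root and we're less than our parent...
        while (k > 0 && heap.get(k) < heap.get(parent)) {
            Collections.swap(heap, parent, k); // ...swap with the parent,
            k = parent;                        // take the parent's position,
            parent = (k - 1) / 2;              // and look one level further up.
        }
    }

    public static void main(String[] args) {
        // A valid min-heap with 0 tacked onto the end, out of place.
        List<Integer> h = new ArrayList<>(Arrays.asList(1, 3, 2, 5, 9, 0));
        swim(h, 5);
        System.out.println(h.get(0)); // 0 has bubbled up to the root
    }
}
```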

It's a top-down node sink, and here we want to sink node k. I grab the left child node, but also the right child node. Remember, we're working zero-based, so the children are at 2k + 1 and 2k + 2 instead of 2k and 2k + 1. Then I need to figure out which child is less; it's going to be either the left one or the right one, and I assume to start with that the left one is smaller.

Then I correct that hypothesis in case it was false: I check that the right node is within the size of the heap, and if the right node is less than the left node, then the smallest one is the right node. Our stopping condition is that we're outside the bounds of the tree, or we can't sink any more. Then we do a similar thing: we swap, and k becomes the smallest child, like we did in the last method. Also, I made swapping an explicit method, because I also have to swap things inside the map and then set the values as well. This is really what adds a lot of overhead for the map: every time we call the swap method, we also swap things inside the map, which can be a lot of overhead.
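A sketch of an explicit swap that also keeps the index map in sync; this per-swap map maintenance is exactly the constant overhead being described (names and structure are my own):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.TreeSet;

public class SwapDemo {

    // Swap positions i and j in the heap AND update both elements'
    // index sets in the map. The map work is the extra per-swap cost.
    public static void swap(List<Integer> heap,
                            Map<Integer, TreeSet<Integer>> map,
                            int i, int j) {
        Integer a = heap.get(i);
        Integer b = heap.get(j);
        heap.set(i, b);
        heap.set(j, a);
        map.get(a).remove(i); // a no longer lives at i...
        map.get(a).add(j);    // ...it lives at j now
        map.get(b).remove(j);
        map.get(b).add(i);
    }

    public static void main(String[] args) {
        List<Integer> heap = new ArrayList<>(Arrays.asList(3, 1));
        Map<Integer, TreeSet<Integer>> map = new HashMap<>();
        map.put(3, new TreeSet<>(Arrays.asList(0)));
        map.put(1, new TreeSet<>(Arrays.asList(1)));
        swap(heap, map, 0, 1);
        System.out.println(heap);       // [1, 3]
        System.out.println(map.get(3)); // [1]
    }
}
```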

Technically, maps have constant lookup times, but with all the internal hashing, collisions and whatnot, it can get costly. Next, remove. If the element is null, we return false, since we know we don't have any null elements inside our heap. Then there's the linear removal, in linear time, which I've commented out in case you want to revert back and drop the map: you scan through all the elements, and once you find the element you were looking for, you remove it at that index and return true. Otherwise, we're going to use our map.

We get the index of one of the places where the element occurs, and if it exists, we remove it at that index. Okay, now let's have a look at the removeAt method. This is what I covered in the last video. If our heap is empty, well, we can't really remove anything. Otherwise, we swap the node at the index we're removing with the last element in the heap, then we kill off that node and also remove it from our map.

If it so happened that i was equal to the heap size, meaning we just removed the very last element in our heap, we simply return the removed data. Otherwise, after we did the swap, we have to move the swapped node either up or down. I'm kind of too lazy to check whether I need to sink or swim, so I just try both: first I try sinking, and if nothing happened, then I try swimming upwards. In either case, we return the removed data.
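The "try sinking, then swim if nothing moved" trick looks roughly like this on a plain list, with the map bookkeeping omitted:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public class RemoveAtDemo {

    // Remove the node at index i: swap it with the last element, drop the
    // last slot, then repair by trying sink first and, if the node stayed
    // put, trying swim.
    public static Integer removeAt(List<Integer> heap, int i) {
        if (heap.isEmpty()) return null;
        int last = heap.size() - 1;
        Integer removed = heap.get(i);
        Collections.swap(heap, i, last);
        heap.remove(last);
        if (i == last) return removed; // we removed the final element
        Integer elem = heap.get(i);
        sink(heap, i);
        if (heap.get(i).equals(elem)) swim(heap, i); // sinking did nothing
        return removed;
    }

    private static void sink(List<Integer> heap, int k) {
        int n = heap.size();
        while (true) {
            int left = 2 * k + 1, right = 2 * k + 2;
            if (left >= n) break;
            int smallest = left;
            if (right < n && heap.get(right) < heap.get(left)) smallest = right;
            if (heap.get(k) <= heap.get(smallest)) break;
            Collections.swap(heap, k, smallest);
            k = smallest;
        }
    }

    private static void swim(List<Integer> heap, int k) {
        int parent = (k - 1) / 2;
        while (k > 0 && heap.get(k) < heap.get(parent)) {
            Collections.swap(heap, parent, k);
            k = parent;
            parent = (k - 1) / 2;
        }
    }

    public static void main(String[] args) {
        List<Integer> h = new ArrayList<>(Arrays.asList(1, 3, 2, 5, 9, 8));
        System.out.println(removeAt(h, 1)); // 3
        System.out.println(h);              // still a valid min-heap
    }
}
```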

Perfect. So that just readjusts where the swapped node goes, either bubbling up or bubbling down. This next method is one I use in my testing framework to make sure everything is good: it essentially checks the integrity of the minimum heap. Initially you call this method with k equal to zero, which starts at the root, because you want to recursively go down the tree and check that we're maintaining the heap invariant property we need. Our base case is that k is outside the heap size.

If so, we just return true. Otherwise, we get our left and right children, and we make sure that node k is less than both of them; if that's not the case, we return false. And because we have an AND statement where we recurse, if we ever return false, it gets propagated up through the recursion and the whole method returns false. Otherwise, if everything returns true down to the base cases, then we know for sure it's a valid minimum heap.
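The recursive integrity check can be sketched as:

```java
import java.util.Arrays;
import java.util.List;

public class HeapCheckDemo {

    // Recursively verify the min-heap invariant from node k downwards
    // (call with k = 0 to check the whole tree from the root).
    public static boolean isMinHeap(List<Integer> heap, int k) {
        if (k >= heap.size()) return true; // base case: past the end
        int left = 2 * k + 1;
        int right = 2 * k + 2;
        // Node k must be <= both children, when those children exist.
        if (left < heap.size() && heap.get(k) > heap.get(left)) return false;
        if (right < heap.size() && heap.get(k) > heap.get(right)) return false;
        // A single false anywhere propagates up through the &&.
        return isMinHeap(heap, left) && isMinHeap(heap, right);
    }

    public static void main(String[] args) {
        System.out.println(isMinHeap(Arrays.asList(1, 3, 2, 5, 9), 0)); // true
        System.out.println(isMinHeap(Arrays.asList(3, 1, 2), 0));       // false
    }
}
```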

Okay, these last few methods are just map helper methods: how to add things into the map, how to remove elements from the map, and so on. What I'm doing here is using a TreeSet to add and remove elements, because I know Java implements TreeSet as a balanced binary search tree, so all operations on tree sets are logarithmic, which is really nice. Do have a look at those in more detail; they just get and remove values.
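Two of those helpers might look like this sketch (names are mine; the real code is generic over T rather than fixed to Integer):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.TreeSet;

public class MapHelpersDemo {

    // Record that `value` now also lives at heap position `index`.
    public static void mapAdd(Map<Integer, TreeSet<Integer>> map,
                              Integer value, int index) {
        map.computeIfAbsent(value, v -> new TreeSet<>()).add(index);
    }

    // Forget that `value` lives at `index`. TreeSet removal is logarithmic
    // because Java backs TreeSet with a balanced binary search tree.
    public static void mapRemove(Map<Integer, TreeSet<Integer>> map,
                                 Integer value, int index) {
        TreeSet<Integer> set = map.get(value);
        if (set == null) return;
        set.remove(index); // removes the integer `index` from the set
        if (set.isEmpty()) map.remove(value); // drop empty sets entirely
    }

    public static void main(String[] args) {
        Map<Integer, TreeSet<Integer>> map = new HashMap<>();
        mapAdd(map, 7, 0);
        mapAdd(map, 7, 3); // duplicates of 7 at two positions
        mapRemove(map, 7, 0);
        System.out.println(map.get(7)); // [3]
    }
}
```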

And lastly, there's the map swap, which swaps values in the map. Do have a look at those in more detail, but I've pretty much covered everything about the priority queue, so if something's unclear, just drop a comment and I'll get back to you as soon as possible.

But that's the heap: the binary heap implementation of a priority queue. Remember that the source code is on my GitHub repo, so have a look at that. Thank you for watching, and I'll catch you next time.
