The resize method in the HashMap source code

resize function

Because HashMap's constructors do not allocate the internal table,
when put is called and the table is still null, the resize method is used to create it.
In other words, resize has to handle every constructor form: the no-argument constructor, the one-parameter constructor, and the two-parameter constructor can all end up calling it.
The resize method is also called when the number of entries exceeds the threshold.
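
As a quick illustration (my own minimal sketch, not part of the quoted source; the class name ResizeTrigger is made up), all three constructor forms leave the table unallocated, so the first put triggers resize; after that, exceeding the threshold (capacity * load factor) triggers it again:

    import java.util.HashMap;
    import java.util.Map;

    public class ResizeTrigger {
        public static void main(String[] args) {
            // All three constructor forms defer table allocation until the first put
            Map<String, Integer> a = new HashMap<>();          // no arguments: defaults apply later
            Map<String, Integer> b = new HashMap<>(50);        // one parameter: requested capacity is parked in threshold
            Map<String, Integer> c = new HashMap<>(50, 0.6f);  // two parameters: capacity plus load factor

            // The first put finds table == null and calls resize() internally
            a.put("first", 1);

            // After that, resize() runs again whenever size exceeds threshold,
            // doubling the capacity each time (16 -> 32 -> 64 -> ...)
            for (int i = 0; i < 100; i++) {
                a.put("k" + i, i);
            }
        }
    }
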
So the overall idea of resize is:

  1. Save a reference to the old table

  2. Record the old table's length and threshold; if the old table is null, the length is 0

  3. Declare the new table length and the new threshold

  4. If the old table already has a capacity (oldCap > 0)

    • 4.1 If the maximum capacity has already been reached, there is no point in rehashing; just raise the threshold and return the old table
    • 4.2 Otherwise the table can be rehashed
      • 4.2.1 the new capacity and the new threshold are set to twice the old ones
  5. If the old table is empty but a threshold exists, a constructor such as new HashMap(50) was used (it parks the requested capacity in threshold), so newCap = oldThr

  6. Otherwise the table is empty and there is no threshold, so the defaults specify the new table's length and threshold

  7. If newThr is still 0 at this point (which happens on the branch where the old threshold supplied the new capacity), compute it from newCap and the load factor

  8. With these values assigned, everything is ready for rehashing, as the worked example below shows
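
A worked example (my own sketch, not part of the quoted source) of how newCap and newThr come out on the three paths, assuming the default load factor of 0.75:

    public class ResizeSizing {
        static final int DEFAULT_INITIAL_CAPACITY = 16;
        static final float DEFAULT_LOAD_FACTOR = 0.75f;

        public static void main(String[] args) {
            // Path 4.2: an existing table of length 16 with threshold 12 is doubled
            int oldCap = 16, oldThr = 12;
            int newCap = oldCap << 1;                         // 32
            int newThr = oldThr << 1;                         // 24
            System.out.println(newCap + " / " + newThr);      // 32 / 24

            // Path 5: new HashMap(50) parks tableSizeFor(50) = 64 in threshold,
            // so the first resize uses it as the capacity and derives the threshold
            oldCap = 0; oldThr = 64;
            newCap = oldThr;                                  // 64
            newThr = (int) (newCap * DEFAULT_LOAD_FACTOR);    // 48
            System.out.println(newCap + " / " + newThr);      // 64 / 48

            // Path 6: new HashMap() has neither a table nor a threshold, so defaults apply
            newCap = DEFAULT_INITIAL_CAPACITY;                                // 16
            newThr = (int) (DEFAULT_LOAD_FACTOR * DEFAULT_INITIAL_CAPACITY);  // 12
            System.out.println(newCap + " / " + newThr);                      // 16 / 12
        }
    }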

How to rehash

The rough idea:

  1. If the node in the bucket is a tree node, call split directly (covered in the next article)
  2. If it is not a tree node, it is an ordinary linked-list node
    Here there is an interesting expression: e.hash & oldCap
    Why is that?
    According to the source code, every element in the linked list needs to be redistributed. Why?
    First, be clear that which bucket a key is stored in depends on its hash value
    And the hash values stored in the same bucket are not necessarily equal. For example, with a table length of 16, an element with hash 5 and an element with hash 21 both land in bucket 5
    The elements sharing a bucket can then be classified by e.hash & oldCap, as the short check after this list shows
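
A tiny check of that claim (my own snippet, not from the quoted source): with a table of length 16, the bucket index is hash & (length - 1), so 5 and 21 really do share bucket 5.

    public class SameBucket {
        public static void main(String[] args) {
            int length = 16;                 // old table length
            int mask = length - 1;           // 0b1111
            System.out.println(5 & mask);    // 5  -> bucket 5
            System.out.println(21 & mask);   // 5  -> bucket 5 as well
        }
    }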

If e.hash divided by oldCap gives an odd quotient, e.hash & oldCap equals oldCap: 89 / 16 = 5, which is odd, so 89 & 16 is 16. If e.hash divided by oldCap gives an even quotient, the result is 0: 70 / 16 = 4, so 70 & 16 is 0. In other words, for any hash value the result is either 0 or oldCap; the low list handles the nodes where it is 0 and the high list handles the nodes where it is oldCap
A node in the high list is assigned to tab[j + oldCap]; a node in the low list stays at tab[j]
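
A small sketch of that classification (again my own illustration, not from the article): e.hash & oldCap simply tests the one bit that becomes significant when the capacity doubles, so the lo/hi decision agrees with recomputing the index against the new mask:

    public class LoHiSplit {
        public static void main(String[] args) {
            int oldCap = 16;
            int newCap = oldCap << 1;                      // 32
            int[] hashes = {5, 21, 89, 70};

            for (int h : hashes) {
                int oldIndex = h & (oldCap - 1);           // bucket in the old table
                boolean high = (h & oldCap) != 0;          // is the newly significant bit set?
                int newIndex = high ? oldIndex + oldCap    // hi list -> j + oldCap
                                    : oldIndex;            // lo list -> j
                // The shortcut matches recomputing the index with the new mask
                System.out.printf("hash %d: old %d, new %d, check %d%n",
                        h, oldIndex, newIndex, h & (newCap - 1));
            }
        }
    }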

Here is the annotated source code:

    final Node<K,V>[] resize() {
        //threshold is the size at which the next resize happens
        //Save the old table
        Node<K,V>[] oldTab = table;
        //Old table length
        int oldCap = (oldTab == null) ? 0 : oldTab.length;
        //Old threshold; it may have been set by the constructor
        int oldThr = threshold;
        //New table length and the new threshold
        int newCap, newThr = 0;
        if (oldCap > 0) {
            //The old table has already reached the maximum capacity, so it cannot grow any further
            if (oldCap >= MAXIMUM_CAPACITY) {
                threshold = Integer.MAX_VALUE;
                //The length cannot be increased any further
                //Return the original table unchanged
                return oldTab;
            }
            //Double the capacity; if the doubled value is still below the maximum
            //and the old capacity is at least the default (16), double the threshold as well
            else if ((newCap = oldCap << 1) < MAXIMUM_CAPACITY &&
                     oldCap >= DEFAULT_INITIAL_CAPACITY)
                newThr = oldThr << 1; // double threshold (twice the old threshold)
        }
        //The old table is null; oldThr > 0 means an initial capacity was specified in the constructor
        else if (oldThr > 0) // initial capacity was placed in threshold
            newCap = oldThr;
        //The table is empty and there is no threshold, so use the defaults for the new table's length and threshold
        else {               // zero initial threshold signifies using defaults
            newCap = DEFAULT_INITIAL_CAPACITY;
            newThr = (int)(DEFAULT_LOAD_FACTOR * DEFAULT_INITIAL_CAPACITY);
        }
        //newThr has not been set yet; this happens on the branch where the old threshold supplied the new capacity
        if (newThr == 0) {
            float ft = (float)newCap * loadFactor;
            //Determine a new threshold 
            newThr = (newCap < MAXIMUM_CAPACITY && ft < (float)MAXIMUM_CAPACITY ?
                      (int)ft : Integer.MAX_VALUE);
        }
        threshold = newThr;
        @SuppressWarnings({"rawtypes","unchecked"})
            Node<K,V>[] newTab = (Node<K,V>[])new Node[newCap];
        table = newTab;
        //Rehash the buckets,
        //i.e. redistribute the elements of each old bucket into the new table
        if (oldTab != null) {
            for (int j = 0; j < oldCap; ++j) {
                Node<K,V> e;
                if ((e = oldTab[j]) != null) {
                    oldTab[j] = null;
                    //If there is only one element in this bucket
                    if (e.next == null)
                        //e.hash & (newCap - 1) gives the index in the new table
                        //Rehash the single node directly
                        newTab[e.hash & (newCap - 1)] = e;
                    else if (e instanceof TreeNode)
                    //Tree node (red-black tree): let split() redistribute it
                        ((TreeNode<K,V>)e).split(this, newTab, j, oldCap);
                    else { // preserve order
                    //Linked list
                    //The whole list has to be redistributed, because its nodes all map
                    //to the same old index even though their hash values differ
                        Node<K,V> loHead = null, loTail = null;
                        //loHead / loTail: head and tail of the "low" list,
                        //which stays at index j
                        Node<K,V> hiHead = null, hiTail = null;
                        //hiHead / hiTail: head and tail of the "high" list, which moves to index j + oldCap
                        Node<K,V> next;
                        //Traverse each node
                        do {
                            next = e.next;
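                            //(e.hash & oldCap) is either 0 or oldCap: 0 means the node
                            //stays at index j (lo list); oldCap means it moves to j + oldCap (hi list)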
                            
                            if ((e.hash & oldCap) == 0) {
                                //lo list: the first node becomes the head
                                if (loTail == null)
                                    loHead = e;
                                else
                                //append after the current tail
                                    loTail.next = e;
                                //this node becomes the new tail
                                loTail = e;
                            }
                            else {
                                if (hiTail == null)
                                    hiHead = e;
                                else
                                //append after the current tail of the hi list
                                    hiTail.next = e;
                                hiTail = e;
                            }
                        } while ((e = next) != null);
                        //Terminate each list and install it in the new table
                        if (loTail != null) {
                            loTail.next = null;
                            //lo list stays at the same index j
                            newTab[j] = loHead;
                        }
                        if (hiTail != null) {
                            hiTail.next = null;
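                            //hi list moves to index j + oldCap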
                            newTab[j + oldCap] = hiHead;
                        }
                    }
                }
            }
        }
        return newTab;
    }
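
To see a resize actually happen, here is a small demonstration of my own (not part of the quoted source). It peeks at the package-private table field through reflection, which works on JDK 8; on JDK 9 and later it may need --add-opens java.base/java.util=ALL-UNNAMED:

    import java.lang.reflect.Field;
    import java.util.HashMap;

    public class ObserveResize {
        // Read the current table length via reflection (0 if the table is still null)
        static int capacity(HashMap<?, ?> map) throws Exception {
            Field f = HashMap.class.getDeclaredField("table");
            f.setAccessible(true);   // may require --add-opens on newer JDKs
            Object[] table = (Object[]) f.get(map);
            return table == null ? 0 : table.length;
        }

        public static void main(String[] args) throws Exception {
            HashMap<Integer, Integer> map = new HashMap<>();
            System.out.println("before first put: " + capacity(map)); // 0, the table is not allocated yet
            for (int i = 1; i <= 13; i++) {
                map.put(i, i);
            }
            // The first put created a table of length 16 with threshold 12,
            // so the 13th entry pushed size past the threshold and doubled the capacity
            System.out.println("after 13 puts:    " + capacity(map)); // 32
        }
    }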

If there is any mistake, please correct it
