This post tests the efficiency of adding to and reading from common Java collections, including ArrayList, LinkedList, HashMap, TreeSet, and LinkedHashMap.
The main Java collection types are as follows:
Collection Type: Description
ArrayList: an indexed sequence that grows and shrinks dynamically
LinkedList: an ordered sequence that allows efficient insertion and deletion at any position
ArrayDeque: a double-ended queue implemented as a circular array
HashSet: an unordered collection with no duplicate elements
TreeSet: a sorted set
EnumSet: a set of enumerated type values
LinkedHashSet: a set that remembers the insertion order of its elements
PriorityQueue: a collection that allows efficient removal of the smallest element
HashMap: a data structure that stores key/value associations
TreeMap: a map whose keys are kept in sorted order
EnumMap: a map whose keys are of an enumerated type
LinkedHashMap: a map that remembers the insertion order of its entries
WeakHashMap: a map whose entries can be reclaimed by the garbage collector once their keys are no longer referenced elsewhere
IdentityHashMap: a map that compares keys with == instead of equals
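The ordering differences between the map types above are easy to see in a small sketch (class and variable names here are illustrative, not from the benchmark): the same entries inserted into a HashMap, TreeMap, and LinkedHashMap iterate in hash order, sorted key order, and insertion order, respectively.

```java
import java.util.HashMap;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.TreeMap;

public class MapOrderDemo {
    public static void main(String[] args) {
        int[] keys = {3, 1, 2};
        Map<Integer, String> hash = new HashMap<>();        // no guaranteed order
        Map<Integer, String> tree = new TreeMap<>();        // keys kept sorted
        Map<Integer, String> linked = new LinkedHashMap<>(); // insertion order
        for (int k : keys) {
            hash.put(k, "v" + k);
            tree.put(k, "v" + k);
            linked.put(k, "v" + k);
        }
        // TreeMap iterates in sorted key order; LinkedHashMap in insertion order.
        System.out.println("TreeMap keys:       " + tree.keySet());   // [1, 2, 3]
        System.out.println("LinkedHashMap keys: " + linked.keySet()); // [3, 1, 2]
    }
}
```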
1. Adding efficiency test:
The test code is as follows:
int num = 1000000;
ArrayList<StudyRecordBean> als = new ArrayList<>();
long l1 = System.currentTimeMillis();
for (int i = 0; i < num; i++) {
    StudyRecordBean s1 = new StudyRecordBean();
    als.add(s1);
}
long l2 = System.currentTimeMillis();
System.out.println("ArrayList:" + (l2 - l1));

LinkedList<StudyRecordBean> lls = new LinkedList<>();
long l3 = System.currentTimeMillis();
for (int i = 0; i < num; i++) {
    StudyRecordBean s1 = new StudyRecordBean();
    lls.add(s1);
}
long l4 = System.currentTimeMillis();
System.out.println("LinkedList:" + (l4 - l3));

long l5 = System.currentTimeMillis();
HashMap<Integer, StudyRecordBean> hsm = new HashMap<>();
for (int i = 0; i < num; i++) {
    hsm.put(i, new StudyRecordBean());
}
long l6 = System.currentTimeMillis();
System.out.println("HashMap:" + (l6 - l5));

// Note: TreeSet requires StudyRecordBean to implement Comparable
// (or to be given a Comparator); otherwise add() throws ClassCastException.
TreeSet<StudyRecordBean> trs = new TreeSet<>();
long l7 = System.currentTimeMillis();
for (int i = 0; i < num; i++) {
    trs.add(new StudyRecordBean());
}
long l8 = System.currentTimeMillis();
System.out.println("TreeSet:" + (l8 - l7));

TreeMap<Integer, StudyRecordBean> trm = new TreeMap<>();
long l9 = System.currentTimeMillis();
for (int i = 0; i < num; i++) {
    trm.put(i, new StudyRecordBean());
}
long l10 = System.currentTimeMillis();
System.out.println("TreeMap:" + (l10 - l9));

LinkedHashMap<Integer, StudyRecordBean> lhsm = new LinkedHashMap<>();
long l11 = System.currentTimeMillis();
for (int i = 0; i < num; i++) {
    lhsm.put(i, new StudyRecordBean());
}
long l12 = System.currentTimeMillis();
System.out.println("LinkedHashMap:" + (l12 - l11));
When num = 10,000, the output is as follows:
ArrayList:4
LinkedList:3
HashMap:8
TreeSet:6
TreeMap:18
LinkedHashMap:6
When num = 100,000, the output is as follows:
ArrayList:19
LinkedList:6
HashMap:28
TreeSet:17
TreeMap:90
LinkedHashMap:28
When num = 1 million, the output is as follows:
ArrayList:59
LinkedList:64
HashMap:331
TreeSet:111
TreeMap:356
LinkedHashMap:153
When num = 10 million, the output is as follows:
ArrayList:4770
LinkedList:2761
HashMap:6334
TreeSet:1435
TreeMap:6888
LinkedHashMap:12199
For very large data sets, TreeSet adds most efficiently, followed by LinkedList > ArrayList > the others.
For LinkedList and ArrayList, if each element is instead inserted into the middle of the list:
int num = 100000;
ArrayList<StudyRecordBean> als = new ArrayList<>();
long l1 = System.currentTimeMillis();
for (int i = 0; i < num; i++) {
    StudyRecordBean s1 = new StudyRecordBean();
    als.add(als.size() / 2, s1);
}
long l2 = System.currentTimeMillis();
System.out.println("ArrayList:" + (l2 - l1));

LinkedList<StudyRecordBean> lls = new LinkedList<>();
long l3 = System.currentTimeMillis();
for (int i = 0; i < num; i++) {
    StudyRecordBean s1 = new StudyRecordBean();
    lls.add(lls.size() / 2, s1);
}
long l4 = System.currentTimeMillis();
System.out.println("LinkedList:" + (l4 - l3));
Inserting 10,000 elements, each into the middle, outputs:
ArrayList:10
LinkedList:82
HashMap:10
TreeSet:0
TreeMap:11
LinkedHashMap:3
Inserting 30,000 elements, each into the middle, outputs:
ArrayList:35
LinkedList:671
HashMap:12
TreeSet:43
TreeMap:17
LinkedHashMap:7
Inserting 100,000 elements, each into the middle, outputs:
ArrayList:332
LinkedList:16666
HashMap:27
TreeSet:5
TreeMap:56
LinkedHashMap:27
ArrayList is much more efficient than LinkedList when inserting into the middle: LinkedList must walk the node chain to reach the index on every call, while ArrayList locates the position in O(1) and shifts the tail with a fast System.arraycopy.
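If many LinkedList insertions happen around a known position, the repeated traversal can be avoided with a ListIterator, which keeps the cursor in place so each add() is O(1). A small sketch (the example values are illustrative):

```java
import java.util.LinkedList;
import java.util.ListIterator;

public class MiddleInsertDemo {
    public static void main(String[] args) {
        LinkedList<Integer> list = new LinkedList<>();
        for (int i = 0; i < 10; i++) {
            list.add(i); // 0..9
        }
        // Walk to the middle ONCE, then insert at the cursor: each
        // iterator.add() is O(1), versus list.add(index, e), which
        // re-traverses from the nearest end on every call.
        ListIterator<Integer> it = list.listIterator(list.size() / 2);
        for (int i = 0; i < 3; i++) {
            it.add(100 + i); // inserted before the cursor, in order
        }
        System.out.println(list);
        // [0, 1, 2, 3, 4, 100, 101, 102, 5, 6, 7, 8, 9]
    }
}
```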
Summary of adding data:
- For ordered lists: LinkedList is more efficient when appending to the tail; ArrayList is more efficient when inserting into the middle.
- For unordered collections: TreeSet > HashMap > LinkedHashMap
2. Reading efficiency test
Collections tested: ArrayList, LinkedList, HashMap, TreeMap, LinkedHashMap
The test code is as follows:
public static int num = 10000000;
public static ArrayList<StudyRecordBean> als;
public static LinkedList<StudyRecordBean> lls;
public static HashMap<Integer, StudyRecordBean> hsm;
public static TreeSet<StudyRecordBean> trs;
public static TreeMap<Integer, StudyRecordBean> trm;
public static LinkedHashMap<Integer, StudyRecordBean> lhsm;

public static void main(String[] args) {
    add();
    testRead();
}

private static void testRead() {
    long l1 = System.currentTimeMillis();
    for (int i = 0; i < num; i++) {
        als.get(i);
    }
    long l2 = System.currentTimeMillis();
    System.out.println("ArrayList:" + (l2 - l1));

    long l3 = System.currentTimeMillis();
    for (int i = 0; i < num; i++) {
        lls.get(i);
    }
    long l4 = System.currentTimeMillis();
    System.out.println("LinkedList:" + (l4 - l3));

    long l5 = System.currentTimeMillis();
    for (int i = 0; i < num; i++) {
        hsm.get(i);
    }
    long l6 = System.currentTimeMillis();
    System.out.println("HashMap:" + (l6 - l5));

    long l9 = System.currentTimeMillis();
    for (int i = 0; i < num; i++) {
        trm.get(i);
    }
    long l10 = System.currentTimeMillis();
    System.out.println("TreeMap:" + (l10 - l9));

    long l11 = System.currentTimeMillis();
    for (int i = 0; i < num; i++) {
        lhsm.get(i);
    }
    long l12 = System.currentTimeMillis();
    System.out.println("LinkedHashMap:" + (l12 - l11));
}

private static void add() {
    als = new ArrayList<>();
    for (int i = 0; i < num; i++) {
        als.add(new StudyRecordBean());
    }
    lls = new LinkedList<>();
    for (int i = 0; i < num; i++) {
        lls.add(new StudyRecordBean());
    }
    hsm = new HashMap<>();
    for (int i = 0; i < num; i++) {
        hsm.put(i, new StudyRecordBean());
    }
    // TreeSet requires StudyRecordBean to implement Comparable.
    trs = new TreeSet<>();
    for (int i = 0; i < num; i++) {
        trs.add(new StudyRecordBean());
    }
    trm = new TreeMap<>();
    for (int i = 0; i < num; i++) {
        trm.put(i, new StudyRecordBean());
    }
    lhsm = new LinkedHashMap<>();
    for (int i = 0; i < num; i++) {
        lhsm.put(i, new StudyRecordBean());
    }
}
When num=10000, output:
ArrayList:1
LinkedList:58
HashMap:3
TreeMap:7
LinkedHashMap:1
num=50000, output:
ArrayList:3
LinkedList:1089
HashMap:3
TreeMap:20
LinkedHashMap:1
num=100000, output:
ArrayList:21
LinkedList:5135
HashMap:26
TreeMap:7
LinkedHashMap:25
num=5000000, output:
ArrayList:14
LinkedList:*
HashMap:71
TreeMap:8717
LinkedHashMap:71
Conclusion:
For reads on large data sets: ArrayList > LinkedHashMap > HashMap > TreeMap > LinkedList
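The collapse of LinkedList in the read test comes from the access pattern, not the structure itself: get(i) re-traverses from the nearest end on every call, so the indexed loop is O(n²), while iterating with a for-each loop simply follows the node links in O(n). A sketch contrasting the two (the timings printed will vary by machine, but the indexed loop is dramatically slower):

```java
import java.util.LinkedList;

public class LinkedListReadDemo {
    public static void main(String[] args) {
        int num = 100000;
        LinkedList<Integer> lls = new LinkedList<>();
        for (int i = 0; i < num; i++) {
            lls.add(i);
        }

        // O(n^2) overall: each get(i) walks from the head or tail.
        long t1 = System.currentTimeMillis();
        long sumIndexed = 0;
        for (int i = 0; i < num; i++) {
            sumIndexed += lls.get(i);
        }
        long t2 = System.currentTimeMillis();

        // O(n) overall: the iterator just advances one link per element.
        long sumIterated = 0;
        for (int v : lls) {
            sumIterated += v;
        }
        long t3 = System.currentTimeMillis();

        System.out.println("indexed:  " + (t2 - t1) + " ms");
        System.out.println("iterated: " + (t3 - t2) + " ms");
        System.out.println("sums equal: " + (sumIndexed == sumIterated));
    }
}
```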
Summary
Adding data efficiency:
- For ordered lists: LinkedList is more efficient when appending to the tail; ArrayList is more efficient when inserting into the middle.
- For unordered collections: TreeSet > HashMap > LinkedHashMap
Reading data efficiency:
- When reading large amounts of data: ArrayList > LinkedHashMap > HashMap > TreeMap > LinkedList
Reprinted from: http://blog.csdn.net/u014614038/article/details/72519346
Note: these are my own test results; correctness is not guaranteed, so please use them with caution.