PL/SQL bulk processing: how BULK COLLECT and FORALL contribute to optimization

Keywords: SQL, Database, Session

We know that running SQL statements from PL/SQL programs is expensive, because each SQL statement must be submitted to the SQL engine for processing.
This transfer of control between the PL/SQL engine and the SQL engine is called a context switch, and every switch incurs additional overhead.


FORALL and BULK COLLECT, however, allow the PL/SQL engine to compress many context switches into a single one, which dramatically reduces the time spent executing SQL statements that process multiple rows from PL/SQL.
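To make the contrast concrete, here is a small sketch (the table t and the collection ids are illustrative, not taken from the examples below): the row-by-row loop performs one context switch per DELETE, while the FORALL version binds the whole collection at once and switches context only once.

  -- Row-by-row: one context switch per iteration
  FOR i IN ids.FIRST .. ids.LAST LOOP
    DELETE FROM t WHERE id = ids(i);
  END LOOP;

  -- Bulk: the collection is bound once, a single context switch
  FORALL i IN ids.FIRST .. ids.LAST
    DELETE FROM t WHERE id = ids(i);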

Here is a detailed description of both features.
1. Accelerating queries through BULK COLLECT
          
(1) Usage of BULK COLLECT

BULK COLLECT lets a query load its results into a collection in a single operation, rather than processing rows one at a time through a cursor.
It can be used in SELECT INTO, FETCH INTO, and RETURNING INTO statements.
Note that when using BULK COLLECT, all INTO targets must be collections.
             
Here are a few simple examples:
             
Example 1: Using BULK COLLECT in a SELECT INTO statement

  DECLARE
    TYPE sallist IS TABLE OF employees.salary%TYPE;
    sals sallist;
  BEGIN
    SELECT salary BULK COLLECT INTO sals
      FROM employees
     WHERE rownum <= 50;
    -- Next, use the data in the collection
  END;
  /

Example 2: Using BULK COLLECT in a FETCH INTO statement

  DECLARE
    TYPE deptrectab IS TABLE OF departments%ROWTYPE;
    dept_recs deptrectab;
    -- The cursor must select whole rows to match departments%ROWTYPE
    CURSOR cur IS SELECT * FROM departments WHERE department_id > 10;
  BEGIN
    OPEN cur;
    FETCH cur BULK COLLECT INTO dept_recs;
    CLOSE cur;
    -- Next, use the data in the collection
  END;
  /

Example 3: Using BULK COLLECT in a RETURNING INTO statement

  CREATE TABLE emp AS SELECT * FROM employees;

  DECLARE
    TYPE numlist IS TABLE OF employees.employee_id%TYPE;
    enums numlist;
    TYPE namelist IS TABLE OF employees.last_name%TYPE;
    names namelist;
  BEGIN
    DELETE FROM emp WHERE department_id = 30
    RETURNING employee_id, last_name BULK COLLECT INTO enums, names;
    DBMS_OUTPUT.PUT_LINE('deleted ' || SQL%ROWCOUNT || ' rows:');
    FOR i IN enums.FIRST .. enums.LAST LOOP
      DBMS_OUTPUT.PUT_LINE('employee #' || enums(i) || ': ' || names(i));
    END LOOP;
  END;
  /

  Output:
  deleted 6 rows:
  employee #114: Raphaely
  employee #115: Khoo
  employee #116: Baida
  employee #117: Tobias
  employee #118: Himuro
  employee #119: Colmenares

(2) Using BULK COLLECT to optimize large DELETE and UPDATE operations

DELETE and UPDATE are handled in the same way.

Example: 10 million rows need to be deleted from a large table containing 100 million rows, as quickly as possible and with minimal impact on the database's other applications.

If the business cannot be stopped, the following approach can be used: split the work by ROWID, sort by ROWID, process in batches, and delete from the table by ROWID. This is the best choice when the business cannot stop.
Keeping each commit under 10,000 rows generally does not put too much pressure on the rollback (undo) segments; when I do large DML I usually commit every one or two thousand rows.
Run outside business peaks, the job has little impact on the application.
The code is as follows:

  DECLARE
    -- Cursor sorted by ROWID
    -- The delete condition is oo=xx; adjust it to the actual situation
    CURSOR mycursor IS SELECT rowid FROM t WHERE OO=XX ORDER BY rowid;
    TYPE rowid_table_type IS TABLE OF rowid INDEX BY PLS_INTEGER;
    v_rowid rowid_table_type;
  BEGIN
    OPEN mycursor;
    LOOP
      FETCH mycursor BULK COLLECT INTO v_rowid LIMIT 5000; -- commit every 5000 rows
      EXIT WHEN v_rowid.count = 0;
      FORALL i IN v_rowid.FIRST .. v_rowid.LAST
        DELETE FROM t WHERE rowid = v_rowid(i);
      COMMIT;
    END LOOP;
    CLOSE mycursor;
  END;
  /

(3) Restricting the number of records fetched by BULK COLLECT
             
Syntax:
             FETCH cursor BULK COLLECT INTO ... [LIMIT rows];
where rows can be a constant, a variable, or any expression that evaluates to an integer.

Suppose you need to query and process 10,000 rows of data. You could use BULK COLLECT to fetch all the rows at once and fill a very large collection, but this consumes a large amount of PGA memory in the session, and the application may suffer performance degradation from PGA paging.

This is where the LIMIT clause is very useful: it helps us control how much memory the program uses to process the data.
             
Example:

  DECLARE
    CURSOR allrows_cur IS SELECT * FROM employees;
    TYPE employee_aat IS TABLE OF allrows_cur%ROWTYPE INDEX BY BINARY_INTEGER;
    v_emp employee_aat;
  BEGIN
    OPEN allrows_cur;
    LOOP
      FETCH allrows_cur BULK COLLECT INTO v_emp LIMIT 100;

      /* Process the data by scanning the collection */
      FOR i IN 1 .. v_emp.count LOOP
        upgrade_employee_status(v_emp(i).employee_id);
      END LOOP;

      -- Check AFTER processing: the last fetch may return fewer than
      -- 100 rows and still set %NOTFOUND
      EXIT WHEN allrows_cur%NOTFOUND;
    END LOOP;

    CLOSE allrows_cur;
  END;
  /

(4) Batch extraction of multiple columns
             
Requirement:
Fetch all rows from the transportation table for vehicles with fuel mileage below 20 (the gas guzzlers).
The code is as follows:

  DECLARE
    -- Declare the collection type
    TYPE vehtab IS TABLE OF transportation%ROWTYPE;
    -- A collection of this type
    gas_quzzlers vehtab;
  BEGIN
    SELECT * BULK COLLECT INTO gas_quzzlers
      FROM transportation
     WHERE mileage < 20;
    ...

(5) Using the RETURNING clause for bulk operations

With the RETURNING clause we can easily obtain the results of a DML operation we have just completed, without issuing an additional query.
For an example, see the third example under the usage of BULK COLLECT above.
             
             
2. Accelerating DML through FORALL
          
FORALL tells the PL/SQL engine to bind all members of one or more collections to an SQL statement before sending the statement to the SQL engine.
          
(1) Syntax
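A minimal sketch of the general form, following Oracle's documented FORALL syntax (the SAVE EXCEPTIONS clause is optional; the emp_archive table below is hypothetical, and the enums and names collections are assumed to be populated as in the earlier RETURNING example):

  FORALL index IN lower_bound .. upper_bound [SAVE EXCEPTIONS]
    sql_statement;

  -- For example, bulk-insert two collections in a single context switch:
  FORALL i IN enums.FIRST .. enums.LAST
    INSERT INTO emp_archive (employee_id, last_name)
    VALUES (enums(i), names(i));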

To be continued.

Reposted from: http://blog.csdn.net/dba_waterbin/article/details/8581448

Posted by Denholm on Sat, 05 Jan 2019 15:06:09 -0800