Oracle ORA-04030 even when using Bind Variables in a LOOP - sql

I have to delete almost 500 million rows from a remote table using PL/SQL. Since the UNDO tablespace cannot handle the volume, the deletes are being done in batches of 1,000,000 rows, each followed by a commit. Also, to reduce hard parsing, I am using bind variables with the following syntax:
str := 'delete from table@link where id >= :x and id < :y';
execute immediate str using start_id, start_id + 1000000;
After every invocation, start_id is incremented by 1,000,000 until SQL%ROWCOUNT returns 0 (zero) and end_id (which is known) is reached.
But the process is failing with ORA-04030:
ORA-04030: out of process memory when trying to allocate 16408 bytes (QERHJ hash-joi,QERHJ Bit vector)
ORA-04030: out of process memory when trying to allocate 41888 bytes (kxs-heap-c,temporary memory)
Note that I am already using bind variables, so there is no hard parsing after the first execution.
One possible culprit could be the range of IDs at the target. Assuming the first few rows are in increasing order, the IDs are:
100,000,000,000
200,000,000,000
50,000,000,000,000,000
50,000,000,000,011,111
On the second iteration, IDs from 200,000,000,000 to 200,000,100,000 will be deleted.
But since there are no IDs in that range, it will take almost 50,000,000,000 iterations to reach the next row (50,000,000,000,000,000 / 1,000,000 = 50,000,000,000).
Of course, I can always examine the IDs at the target and choose a suitable range (much larger than the default 1 million).
But that alone should not cause the process to run out of memory.
Added Code:
remote.sql (execute on the remote database):
create table test1
(
id number(38) primary key
);
insert into test1 select level from dual connect by level < 1000000;
insert into test1 values ( 1000000000000 );
insert into test1 values ( 2000000000000 );
commit;
exec dbms_stats.gather_table_stats ( ownname => user, tabname => 'test1', cascade => true, estimate_percent => 100 );
commit;
local.sql (execute on the local database):
create or replace procedure batch_del
as
l_min_val integer;
l_max_val integer;
l_cnt integer;
l_cnt_dst integer;
l_begin integer;
l_end integer;
l_str varchar2(1000);
l_tot_cnt integer;
pragma autonomous_transaction;
begin
l_tot_cnt := 0;
l_str := ' select min(id), max(id), count(*) from test1@dst';
execute immediate l_str into l_min_val, l_max_val, l_cnt_dst;
dbms_output.put_line ( 'min: ' || l_min_val || ' max: ' || l_max_val
|| ' total : ' || l_cnt_dst );
l_begin := l_min_val;
while l_begin < l_max_val
loop
begin
l_end := l_begin + 100000;
delete from test1@dst where id >= l_begin and id < l_end;
l_cnt := SQL%ROWCOUNT;
dbms_output.put_line ( 'Rows Processed : ' || l_cnt );
l_tot_cnt := l_tot_cnt + l_cnt;
dbms_output.put_line ( 'Rows Processed So Far : ' || l_tot_cnt );
commit;
exception
when others then
dbms_output.put_line ( 'Error : ' || sqlcode );
end;
l_begin := l_begin + 100000;
end loop;
dbms_output.put_line ( 'Total : ' || l_tot_cnt );
end;
**All-Local Implementation**
drop table test1;
create table test1
(
id number(38) primary key
);
insert into test1 select level from dual connect by level < 1000000;
insert into test1 values ( 1000000000000 );
insert into test1 values ( 2000000000000 );
commit;
exec dbms_stats.gather_table_stats ( ownname => user, tabname => 'test1', cascade => true, estimate_percent => 100 );
commit;
create or replace procedure batch_del
as
l_min_val integer;
l_max_val integer;
l_cnt integer;
l_begin integer;
l_tot_cnt integer;
pragma autonomous_transaction;
begin
l_tot_cnt := 0;
select min(id), max(id) into l_min_val, l_max_val from test1;
l_begin := l_min_val;
while l_begin < l_max_val
loop
begin
delete from test1 where id >= l_begin and id < l_begin + 10000;
l_cnt := SQL%ROWCOUNT;
dbms_output.put_line ( 'Rows Processed : ' || l_cnt );
l_tot_cnt := l_tot_cnt + l_cnt;
dbms_output.put_line ( 'Rows Processed So Far : ' || l_tot_cnt );
commit;
exception
when others then
dbms_output.put_line ( 'Error : ' || sqlcode );
end;
l_begin := l_begin + 10000;
end loop;
dbms_output.put_line ( 'Total : ' || l_tot_cnt );
end;
set timing on;
set serveroutput on size unlimited;
exec batch_del;

You are using DBMS_OUTPUT in your procedure:
dbms_output.put_line ( 'Rows Processed : ' || l_cnt );
....
dbms_output.put_line ( 'Rows Processed So Far : ' || l_tot_cnt );
Each of the above calls produces a string roughly 25 characters long (~25 bytes).
The PUT_LINE procedure does not print a message immediately to the console; rather, it places all messages into a memory buffer. Please see this note in the documentation: DBMS_OUTPUT
Note: Messages sent using DBMS_OUTPUT are not actually sent until the
sending subprogram or trigger completes. There is no mechanism to
flush output during the execution of a procedure.
....
....
Rules and Limits
The maximum line size is 32767 bytes.
The default buffer size is 20000 bytes. The minimum size is 2000 bytes and the maximum is unlimited.
You wrote in the question:
it will take almost 50,000,000,000 iterations
It is very easy to estimate the memory required to store the DBMS_OUTPUT messages: 2 messages, each about 25 bytes, over 50,000,000,000 iterations:
2 * 25 * 50,000,000,000 = 2,500,000,000,000 bytes
That is, you would need about 2,500 gigabytes (~2.5 terabytes) of memory to store all the messages from your procedure. PGA_AGGREGATE_TARGET = 1.5 GB is definitely too low.
Just remove DBMS_OUTPUT from your code; no human being can read 50-100 billion messages from the console anyway.
If you want to monitor the procedure, use the DBMS_APPLICATION_INFO.SET_CLIENT_INFO procedure. You can store a message of up to 64 characters, and then query the V$SESSION view to retrieve the latest message.

This is not an answer, but it's too large to fit in a comment, so here we are!
There's no need for the dynamic SQL. If I were you, I'd rewrite this as:
create or replace procedure batch_del as
l_min_val integer;
l_max_val integer;
l_begin integer;
l_end integer;
l_rows_to_process number := 100000;
l_cnt_dst integer; -- populated by the SELECT ... INTO below
pragma autonomous_transaction;
begin
select min(id),
max(id),
count(*)
into l_min_val,
l_max_val,
l_cnt_dst
from test1@dst;
l_begin := l_min_val;
while l_begin < l_max_val
loop
begin
l_end := l_begin + l_rows_to_process;
delete from test1@dst
where id >= l_begin
and id < l_end;
dbms_output.put_line('rows deleted: '||sql%rowcount);
commit;
exception
when others then
dbms_output.put_line('error : ' || sqlcode);
end;
l_begin := l_begin + l_rows_to_process;
end loop;
end;
/
Alternatively, if you've got non-consecutive IDs, perhaps this would be more performant for you:
create or replace procedure batch_del as
type type_id_array is table of number index by pls_integer;
l_min_id_array type_id_array;
l_max_id_array type_id_array;
l_rows_to_process number := 10000;
pragma autonomous_transaction;
begin
select min(id) min_id,
max(id) max_id bulk collect
into l_min_id_array,
l_max_id_array
from (select --/*+ driving_site(t1) */
id,
ceil((row_number() over(order by id)) / l_rows_to_process) grp
from test1 t1)
group by grp
order by grp;
for i in l_min_id_array.first..l_min_id_array.last
loop
begin
delete from test1
where id between l_min_id_array(i) and l_max_id_array(i);
dbms_output.put_line('rows deleted in loop '||i||': '||sql%rowcount);
commit;
exception
when others then
-- i hope there is some better way of logging an error in your
-- production db; e.g. a separate procedure writing to a log table.
dbms_output.put_line('error in loop '||i||': ' || sqlcode);
end;
end loop;
end batch_del;
/

To make @boneist's answer more flexible, one can use EXECUTE IMMEDIATE as follows:
loop
.....
str := 'select min(id) min_id, max(id) max_id
          from (select id,
                       ceil((row_number() over (order by id)) / :batch_size) grp
                  from test1 t1)
         group by grp
         order by grp';
execute immediate str
  bulk collect into l_min_id_array, l_max_id_array
  using l_rows_to_process;
....
end loop;

Related

Approximate estimate of size of table in Oracle [duplicate]

Is this possible? Or, at the least, I'm looking to get a list of the sizes of all rows in a table.
select vsize(col1) + vsize(col2) + vsize(col3) +
long_raw_length_function(long_col) + DBMS_LOB.GETLENGTH(blob_col)
from table
where id_col = id_val;
For the long_raw_length_function, see this: Get the LENGTH of a LONG RAW.
If you're interested in the average row length, you could gather statistics on the table (with the DBMS_STATS package), then query ALL_TABLES.AVG_ROW_LEN.
Below is a query I have modified to estimate the maximum row length when you don't have any data yet. This can help with capacity planning for environment setup:
SET serveroutput ON linesize 300
DECLARE
v_max_size NUMBER := 0;
v_owner VARCHAR2(30);
v_table_name VARCHAR2(30);
v_data_type VARCHAR2(30);
v_data_length NUMBER := 0;
v_data_precision NUMBER := 0;
CURSOR CUR_TABLE
IS
SELECT DISTINCT table_name
FROM all_tab_columns
WHERE owner='TMS_OWNER'
AND table_name NOT LIKE 'VIEW%'
ORDER BY table_name;
BEGIN
FOR Tab IN CUR_TABLE
LOOP
v_table_name := Tab.table_name;
v_max_size := 0;
FOR i IN
(SELECT owner,
table_name,
data_type,
data_length,
data_precision
FROM all_tab_columns
WHERE owner ='TMS_OWNER'
AND table_name = v_table_name
)
LOOP
IF i.data_type = 'NUMBER' THEN
-- data_precision is NULL for unconstrained NUMBER columns; fall back to the
-- 22-byte maximum internal length so the running total is not nulled out
v_max_size := (v_max_size + NVL(i.data_precision, 22));
ELSE
v_max_size := (v_max_size + i.data_length);
END IF;
END LOOP;
dbms_output.put_line(chr(10));
dbms_output.put_line('Table ='||v_table_name||', Max Record Size = '||v_max_size||' bytes');
END LOOP;
END;
/

Dynamic sql with for loop PL/SQL

The following query needs to be converted to dynamic SQL, without a hard-coded cursor query, using l_query; I do not know l_query in advance, as it comes in as a parameter.
Inside the loop, I need to execute another insert query (l_insert_query) that also comes in as a parameter.
Your counsel would be much appreciated.
DECLARE
CURSOR cust
IS
SELECT *
FROM customer
WHERE id < 500;
BEGIN
l_query := 'SELECT * FROM customer WHERE id < 5';
l_insert_query :=
'insert into data ( name, mobile) values ( cust.name,cust.mobile)';
FOR r_cust IN cust
LOOP
EXECUTE IMMEDIATE l_insert_query;
END LOOP;
END;
You could do this with a dynamic PL/SQL block:
declare
l_query varchar2(100) := 'SELECT * FROM customer WHERE id < 5';
l_insert varchar2(100) := 'insert into data ( name, mobile) values ( cust.name,cust.mobile)';
l_plsql varchar2(4000);
begin
l_plsql := '
begin
for cust in (' || l_query || ') loop
' || l_insert || ';
end loop;
end;
';
dbms_output.put_line(l_plsql);
execute immediate l_plsql;
end;
/
The l_plsql statement ends up as a generated PL/SQL block using the cursor query and insert statement:
begin
for cust in (SELECT * FROM customer WHERE id < 5) loop
insert into data ( name, mobile) values ( cust.name,cust.mobile);
end loop;
end;
db<>fiddle
But the fact that you can do this doesn't mean you should. This is vulnerable to SQL injection, and doesn't seem like a very safe, sensible or efficient way to handle data manipulation in your system.

Oracle PL/SQL How to store and fetch a dynamic multi column query

I am attempting something hard with dynamic PL/SQL here.
I can't manage to fetch from a query whose columns are dynamic.
I am iterating over the column names to concatenate a full query to be executed against another table.
sql_req := 'select ';
for c in (SELECT name_col from TAB_LISTCOL)
loop
sql_req := sql_req || 'sum(' || c.name_col || '),';
end loop;
sql_req := sql_req || ' from ANOTHER_TAB ';
And when I try to execute it with EXECUTE IMMEDIATE, or cursors, or INTO/BULK COLLECT, or just a fetch, I can't manage to iterate over the result.
I have tried a lot.
Can you help me, please? Or maybe it is not possible?
PS: I know the trailing comma is wrong, but my actual code is more complex than this; I didn't want to include more than necessary.
If you only want to build the column list as a string, you can use LISTAGG:
select listagg(name_col, ',') WITHIN GROUP (ORDER BY null) from TAB_LISTCOL
Please see if this helps.
In the absence of the actual table structures and requirements, I'm creating dummy tables and a query to illustrate an example:
SQL> create table another_tab
as
select 10 dummy_value1, 100 dummy_value2, 1000 dummy_value3 from dual union all
select 11 dummy_value1, 101 dummy_value2, 1001 dummy_value3 from dual union all
select 12 dummy_value1, 102 dummy_value2, 1003 dummy_value3 from dual
;
Table created.
SQL> create table tab_listcol
as select column_name from dba_tab_cols where table_name = 'ANOTHER_TAB'
;
Table created.
To reduce complexity in the final block, I'm defining a function to generate the dynamic SQL query. This is based on your example and will need changes according to your actual requirement.
SQL> create or replace function gen_col_based_query
return varchar2
as
l_query varchar2(4000);
begin
l_query := 'select ';
for cols in ( select column_name cname from tab_listcol )
loop
l_query := l_query || 'sum(' || cols.cname || '), ' ;
end loop;
l_query := rtrim(l_query,', ') || ' from another_tab';
return l_query;
end;
/
Function created.
Sample output from the function will be as follows
SQL> select gen_col_based_query as query from dual;
QUERY
--------------------------------------------------------------------------------
select sum(DUMMY_VALUE1), sum(DUMMY_VALUE2), sum(DUMMY_VALUE3) from another_tab
Below is a sample block for executing a dynamic cursor using DBMS_SQL. For your ease of understanding, I've added comments wherever possible. More info here.
SQL> set serveroutput on size unlimited
SQL> declare
sql_stmt clob;
src_cur sys_refcursor;
curid number;
desctab dbms_sql.desc_tab; -- collection type
colcnt number;
namevar varchar2 (50);
numvar number;
datevar date;
l_header varchar2 (4000);
l_out_rows varchar2 (4000);
begin
/* Generate dynamic sql from the function defined earlier */
select gen_col_based_query into sql_stmt from dual;
/* Open cursor variable for this dynamic sql */
open src_cur for sql_stmt;
/* To fetch the data, however, you cannot use the cursor variable, since the number of elements fetched is unknown at compile time.
Therefore you use DBMS_SQL.TO_CURSOR_NUMBER to convert the REF CURSOR variable to a SQL cursor number, which you can then pass to DBMS_SQL subprograms.
*/
curid := dbms_sql.to_cursor_number (src_cur);
/* Use DBMS_SQL.DESCRIBE_COLUMNS to describe columns of your dynamic cursor, returning information about each column in an associative array of records viz., desctab. The no. of columns is returned in colcnt variable.
*/
dbms_sql.describe_columns (curid, colcnt, desctab);
/* Define columns at runtime based on the data type (number, date or varchar2 - you may add to the list)
*/
for indx in 1 .. colcnt
loop
if desctab (indx).col_type = 2 -- number data type
then
dbms_sql.define_column (curid, indx, numvar);
elsif desctab (indx).col_type = 12 -- date data type
then
dbms_sql.define_column (curid, indx, datevar);
else -- assuming string
dbms_sql.define_column (curid, indx, namevar, 100);
end if;
end loop;
/* Print header row */
for i in 1 .. desctab.count loop
l_header := l_header || ' | ' || rpad(desctab(i).col_name,20);
end loop;
l_header := l_header || ' | ' ;
dbms_output.put_line(l_header);
/* Loop to retrieve each row of data identified by the dynamic cursor and print output rows
*/
while dbms_sql.fetch_rows (curid) > 0
loop
l_out_rows := null; -- reset the output line for each fetched row
for indx in 1 .. colcnt
loop
if (desctab (indx).col_type = 2) -- number data type
then
dbms_sql.column_value (curid, indx, numvar);
l_out_rows := l_out_rows || ' | ' || rpad(numvar,20);
elsif (desctab (indx).col_type = 12) -- date data type
then
dbms_sql.column_value (curid, indx, datevar);
l_out_rows := l_out_rows || ' | ' || rpad(datevar,20);
elsif (desctab (indx).col_type = 1) -- varchar2 data type
then
dbms_sql.column_value (curid, indx, namevar);
l_out_rows := l_out_rows || ' | ' || rpad(namevar,20);
end if;
end loop;
l_out_rows := l_out_rows || ' | ' ;
dbms_output.put_line(l_out_rows);
end loop;
dbms_sql.close_cursor (curid);
end;
/
PL/SQL procedure successfully completed.
Output
| SUM(DUMMY_VALUE1) | SUM(DUMMY_VALUE2) | SUM(DUMMY_VALUE3) |
| 33 | 303 | 3004 |
You have to use EXECUTE IMMEDIATE with BULK COLLECT.
Below is an example of the same. For more information, refer to this link.
DECLARE
TYPE name_salary_rt IS RECORD (
name VARCHAR2 (1000),
salary NUMBER
);
TYPE name_salary_aat IS TABLE OF name_salary_rt
INDEX BY PLS_INTEGER;
l_employees name_salary_aat;
BEGIN
EXECUTE IMMEDIATE
q'[select first_name || ' ' || last_name, salary
from hr.employees
order by salary desc]'
BULK COLLECT INTO l_employees;
FOR indx IN 1 .. l_employees.COUNT
LOOP
DBMS_OUTPUT.put_line (l_employees (indx).name);
END LOOP;
END;
If I understand correctly, you want to create a query, execute it, and return the result to another function or some calling app. As the resulting query's columns are not known beforehand, I'd return a ref cursor in this case:
create or replace function get_sums return sys_refcursor
as
my_cursor sys_refcursor;
v_query varchar2(32767);
begin
select
'select ' ||
listagg('sum(' || name_col || ')', ', ') within group (order by name_col) ||
' from another_tab'
into v_query
from tab_listcol;
open my_cursor for v_query;
return my_cursor;
end get_sums;
/

character string buffer too small ORA-06502

I am having a problem while concatenating VARCHAR2 values in a cursor loop.
The procedure iterates in a loop to build the IN clause for insert and delete operations in batches. The process runs in batches of 1,000 account numbers.
For a small number of records it works, but when it tries to concatenate a large number of records (36,451,477 in the temp table) in the loop, it throws:
java.sql.SQLException: ORA-06502: PL/SQL: numeric or value error:
character string buffer too small ORA-06512: at
"QA01BT.LOAD_ITEM_DATA_TO_CONSOLIDATE", line 23 ORA-06512: at line 1
I have set the maximum size of search_id to 32,767, but it still does not work.
Is there any other way to achieve this?
create or replace PROCEDURE LOAD_ITEM_DATA_TO_CONSOLIDATE(updatecount OUT NUMBER
)
IS
cnt NUMBER := 0;
c_limit CONSTANT PLS_INTEGER DEFAULT 1000;
search_id varchar2(32727);
TYPE account_array
IS TABLE OF VARCHAR2(255) INDEX BY BINARY_INTEGER;
l_data ACCOUNT_ARRAY;
CURSOR account_cursor IS
SELECT DISTINCT account_no AS account_num
FROM item_temp;
BEGIN
OPEN account_cursor;
LOOP
FETCH account_cursor bulk collect INTO l_data limit c_limit;
search_id := '''';
FOR i IN 1 .. l_data.count LOOP
IF( i != 1 ) THEN
search_id := search_id
|| ','
|| ''''
|| l_data(i)
|| '''';
ELSE
search_id := search_id
|| l_data(i)
|| '''';
END IF;
END LOOP;
BEGIN
SAVEPOINT move_data_to_temp_table;
EXECUTE IMMEDIATE 'delete from item where ACCOUNT_NO IN('||search_id||')';
EXECUTE IMMEDIATE 'insert into item(ID,ACCOUNT_NO,ITEM_ID,ITEM_VALUE) select HIBERNATE_SEQUENCE.nextval,temp.ACCOUNT_NO,temp.ITEM_ID,temp.ITEM_VALUE from item_TEMP temp where ACCOUNT_NO IN('||search_id||')';
cnt := cnt + SQL%rowcount;
COMMIT;
EXCEPTION WHEN OTHERS THEN ROLLBACK to move_data_to_temp_table;
END;
EXIT WHEN account_cursor%NOTFOUND;
END LOOP;
updatecount := cnt;
CLOSE account_cursor;
END LOAD_ITEM_DATA_TO_CONSOLIDATE;
This seems somewhat over-engineered. Why not just this?
create or replace PROCEDURE LOAD_ITEM_DATA_TO_CONSOLIDATE
(updatecount OUT NUMBER)
IS
BEGIN
delete from item
where ACCOUNT_NO IN ( SELECT account_no
FROM item_temp);
insert into item(ID,ACCOUNT_NO,ITEM_ID,ITEM_VALUE)
select HIBERNATE_SEQUENCE.nextval, temp.ACCOUNT_NO, temp.ITEM_ID, temp.ITEM_VALUE
from item_TEMP temp ;
updatecount := SQL%rowcount;
END LOAD_ITEM_DATA_TO_CONSOLIDATE;
If you do decide you need to do this in batches and are worried about that string getting too long or having too many elements in the list (max is 1000), you should try putting your values into an array and then using IN against the array, via a table function or a direct reference to the table.
Extra bonus: no need for dynamic SQL!
Something like this:
CREATE OR REPLACE TYPE strings_t IS TABLE OF VARCHAR2 (255)
/
CREATE OR REPLACE PROCEDURE load_item_data_to_consolidate (
updatecount OUT NUMBER)
IS
cnt NUMBER := 0;
c_limit CONSTANT PLS_INTEGER DEFAULT 1000;
l_data strings_t;
CURSOR account_cursor
IS
SELECT DISTINCT account_no AS account_num FROM item_temp;
BEGIN
OPEN account_cursor;
LOOP
FETCH account_cursor BULK COLLECT INTO l_data LIMIT c_limit;
BEGIN
SAVEPOINT move_data_to_temp_table;
DELETE FROM item
WHERE account_no IN (SELECT COLUMN_VALUE FROM TABLE (l_data));
INSERT INTO item (id,
account_no,
item_id,
item_value)
SELECT hibernate_sequence.NEXTVAL,
temp.account_no,
temp.item_id,
temp.item_value
FROM item_temp temp
WHERE account_no IN (SELECT COLUMN_VALUE FROM TABLE (l_data));
cnt := cnt + SQL%ROWCOUNT;
COMMIT;
EXCEPTION
WHEN OTHERS
THEN
ROLLBACK TO move_data_to_temp_table;
END;
EXIT WHEN account_cursor%NOTFOUND;
END LOOP;
CLOSE account_cursor;
updatecount := cnt; -- return the total number of rows inserted
END;

Any alternatives to using cursor in SQL procedure in Oracle 10g?

I give the SQL a few inputs, and I need to get all the IDs (and their count) that don't satisfy the required criteria.
I would like to know whether there are any alternatives to using a cursor.
DECLARE
v_count INTEGER;
v_output VARCHAR2 (1000);
pc table1%ROWTYPE;
unmarked_ids EXCEPTION;
dynamic_sql VARCHAR (5000);
cur SYS_REFCURSOR;
id pp.id%TYPE;
pos INTEGER;
BEGIN
v_count := 0;
SELECT *
INTO pc
FROM table1
WHERE id = '&ID';
DBMS_OUTPUT.ENABLE;
dynamic_sql :=
'SELECT ID from pp
WHERE ( TO_CHAR(cdate, ''yyyy/mm/dd'') =
TO_CHAR (:a, ''yyyy/mm/dd''))
AND aid IN (SELECT aid FROM ppd WHERE TO_CHAR(cdate, ''yyyy/mm/dd'') =
TO_CHAR (:b, ''yyyy/mm/dd'')
AND cid = :c )
AND cid <> :d';
OPEN cur FOR dynamic_sql USING pc.cdate, pc.cdate, pc.id, pc.id;
LOOP
FETCH cur INTO id;
EXIT WHEN cur%NOTFOUND;
v_count := v_count + 1;
DBMS_OUTPUT.PUT_LINE (' Id:' || id);
END LOOP;
CLOSE cur;
IF (v_count > 0)
THEN
DBMS_OUTPUT.PUT_LINE ( 'Count: ' || v_count || ' SQL: ' || dynamic_sql);
RAISE unmarked_ids;
END IF;
DBMS_OUTPUT.PUT_LINE('SQL ended successfully');
EXCEPTION
WHEN unmarked_ids
THEN
DBMS_OUTPUT.put_line (
'Found IDs that are not marked with the current id.');
WHEN NO_DATA_FOUND
THEN
DBMS_OUTPUT.put_line (
'No data found in table1 with the current id ' || '&ID');
END;
There are bind variables in the query; one of them is a date, and there are three more.
The count and the IDs need to be shown, and will be reported later.
You could store the rowid in a temporary table along with an index value (0...n) and then use a while loop to go through the index values and join to the real table using the rowid.