HxGN EAM Resolved Issues for 2022


SG-1924 Data Lake - Upload Utility causing An error occurred getting the Tenant Data from table R5OBJECTS. Check Tenant connection information. Exception Message The first batch of 700 rows is larger than the payload Size 5242880

 Description 

When R5STRUCTURES records are uploaded with the Upload Utility, this standard workflow can create a situation in EAM where more than 600 R5OBJECTS rows have OBJ_PARENT populated with a value and OBJ_UPDATED set to the same date/time stamp, and this causes the error

‘An error occurred getting the Tenant Data from table R5OBJECTS. Check Tenant connection information. Exception Message The first batch of 700 rows is larger than the payload Size 5242880’

which in turn blocks the data lake flattening scripts.

There was a discussion that the error could only be reproduced with custom scripts, but it can also be reproduced with the standard Upload Utility on a small table like R5STRUCTURES, so a solution needs to be found for the Data Lake payload issue.
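As background, one way to confirm the condition described above is to count how many R5OBJECTS rows share a single OBJ_UPDATED timestamp. The query below is only a diagnostic sketch, assuming OBJ_UPDATED is the change timestamp the Data Lake extract batches on; clusters larger than roughly 600 rows are the ones that can overflow the 5242880-byte payload:

{code:sql}
-- Diagnostic sketch (assumption: OBJ_UPDATED is the timestamp the
-- Data Lake extract keys its batches on). Lists timestamps shared
-- by more than 600 rows, i.e. the clusters that can exceed the
-- 5242880-byte payload limit.
SELECT obj_updated, COUNT(*) AS row_cnt
FROM   r5objects
GROUP  BY obj_updated
HAVING COUNT(*) > 600
ORDER  BY row_cnt DESC;
{code}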

Steps to reproduce:

# Datalake - Add a schedule for R5OBJECTS.

# Positions screen (or Upload Utility) - Create 1000 position records.

# Upload Utility - Upload Structures records for the above positions. This creates the R5STRUCTURES records and populates OBJ_PARENT (based on R5STRUCTURES) and OBJ_UPDATED (with the sysdate) on the position objects; see the SQL sketch after these steps.

# Wait for the Datalake schedule to run - the error 'An error occurred getting the Tenant Data from table R5OBJECTS. Check Tenant connection information. Exception Message The first batch of 700 rows is larger than the payload Size 5242880' is raised.
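The net effect of step 3 on R5OBJECTS is roughly the statement below. This is an illustrative sketch only, not what the Upload Utility literally executes, and the R5STRUCTURES column names STC_PARENT/STC_CHILD are assumptions for illustration; the point is that all 1000 rows are stamped in a single pass and therefore share one OBJ_UPDATED value, which is exactly the >600-row cluster that breaks the extract.

{code:sql}
-- Illustrative sketch only; STC_PARENT/STC_CHILD are assumed column
-- names. The key point is the single SYSDATE shared by every row
-- touched in the one pass.
UPDATE r5objects o
SET    o.obj_parent  = (SELECT s.stc_parent
                        FROM   r5structures s
                        WHERE  s.stc_child = o.obj_code),
       o.obj_updated = SYSDATE
WHERE  EXISTS (SELECT 1
               FROM   r5structures s
               WHERE  s.stc_child = o.obj_code);
{code}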

Example upload template attached.

Actual result - A payload error occurs in the Data Lake.

Expected result - No Data Lake error should occur when using standard EAM tools.

======================= Start Note to the QA ===========================

*This Jira can be closed with* [SG-2043|smart-link]*.*

Prerequisite: Please test these in the MT cloud environment where DLU is installed.

# Create 2000 asset records in the r5objects table.

# Log in to the DB and execute the following statements (a worked example of these statements appears after the full list of steps):

a. Alter table r5objects disable all triggers

b. update r5objects set obj_lastsaved = to_date(<<Today's Date>>, 'YYYY-MM-DD HH:MI:SS') where ROWNUM <= 1000

The date value above can be anything.

c. commit;

d. Alter table r5objects enable all triggers

# Go to Administration -> Datalake upload Setup

# Create a schedule with the mandatory fields filled and set the frequency to 5 minutes.

# Go to the Tables tab and add r5objects, save it, then reselect the record and set it to 'Active'.

# Set the schedule to the Active state and wait for it to run.

# Go to OneView and check whether the message is visible with all 400 records, with a size of more than 5 MB transmitted.
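A worked version of the DB statements from step 2, with a hypothetical date filled in purely for illustration (the <<Today's Date>> placeholder in the step itself is what should actually be substituted). Note that Oracle's 12-hour HH format element raises ORA-01849 for hour values above 12, so HH24 is used here so an afternoon timestamp also parses; the final count is a quick sanity check that 1000 rows were stamped:

{code:sql}
-- Worked example of the step-2 statements; the date is hypothetical.
ALTER TABLE r5objects DISABLE ALL TRIGGERS;

-- HH24 (24-hour clock) is used so times after noon parse correctly;
-- with the 12-hour HH element, an hour above 12 raises ORA-01849.
UPDATE r5objects
SET    obj_lastsaved = TO_DATE('2022-06-15 14:30:00', 'YYYY-MM-DD HH24:MI:SS')
WHERE  ROWNUM <= 1000;

COMMIT;

ALTER TABLE r5objects ENABLE ALL TRIGGERS;

-- Sanity check: should return 1000.
SELECT COUNT(*)
FROM   r5objects
WHERE  obj_lastsaved = TO_DATE('2022-06-15 14:30:00', 'YYYY-MM-DD HH24:MI:SS');
{code}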

Basic testing: When no data is transmitted, the error message column on the Tables tab should read:

*No data in the Table data result set. Possibly we are at the latest updated timestamp*
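One way to provoke that message is to let the schedule run a second time without touching any rows. The sketch below shows the kind of check the extract presumably performs; :last_upload is a hypothetical bind name standing in for the schedule's last successful upload timestamp, since the actual internal query is not documented here:

{code:sql}
-- Hypothetical sketch of the "no new data" condition. :last_upload is
-- an illustrative bind variable for the schedule's last successful
-- upload time; when this returns 0, the Tables tab should show the
-- message above.
SELECT COUNT(*)
FROM   r5objects
WHERE  obj_lastsaved > :last_upload;
{code}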

======================== End Note to the QA ===========================