Using JDBC to call a PL/SQL stored procedure with custom type input parameter, all fields are null

I'm using JDBC with createStruct() to call a stored procedure on an Oracle database that accepts a custom object type as a parameter. The stored procedure inserts the fields of the custom type into a table, but when I SELECT from the table later I see that all the fields I tried to insert are NULL.

The custom type looks like this:

type record_rec as object (owner_id    varchar2(7),
                           target_id   varchar2(8),
                           ip          varchar2(15),
                           prefix      varchar2(7),
                           port        varchar2(4),
                           description varchar2(35),
                           cost_id     varchar2(10))

The stored procedure looks like this:

package body            "PKG_RECORDS"
IS
procedure P_ADD_RECORD (p_target_id in out VARCHAR2,
                      p_record_rec in record_rec)
is
  l_target_id   targets.target_id%TYPE;
BEGIN
  Insert into targets (target_id,
                      owner_id,
                      IP,
                      description,
                      prefix,
                      start_date,
                      end_date,
                      cost_id,
                      port,
                      server_name,
                      server_code)
       values (f_sequence ('TARGETS'),
               p_record_rec.owner_id,
               p_record_rec.ip,
               p_record_rec.description,
               p_record_rec.prefix,
               sysdate,
               to_date ('01-JAN-2050'),
               p_record_rec.cost_id,
               p_record_rec.port,
               'test-server',
               '51')
    returning target_id
         into p_target_id;
 END;
END PKG_RECORDS;

My Java code looks something like this:

try (Connection con = m_dataSource.getConnection()) {
    ArrayList<String> ids = new ArrayList<>();
    CallableStatement call = con.prepareCall("{call PKG_RECORDS.P_ADD_RECORD(?,?)}");
    for (Record t : records) {
        call.registerOutParameter("p_target_id", Types.VARCHAR);
        // Struct attributes must be supplied in the order declared in RECORD_REC:
        // owner_id, target_id, ip, prefix, port, description, cost_id
        call.setObject("p_record_rec",
                con.createStruct("SCHEME_ADM.RECORD_REC", new Object[] {
                        t.getOwner_id(),
                        null, // target_id - will be populated by the SP
                        t.getIp(),
                        t.getPrefix(),
                        t.getPort(),
                        t.getDescription(),
                        t.getCost_id()
                }), Types.STRUCT);
        call.execute();
        ids.add(call.getString("p_target_id"));
    }

    return new QueryRunner().query(con, 
            "SELECT * from TARGETS_V WHERE TARGET_ID IN ("+
                    ids.stream().map(s -> "?").collect(Collectors.joining(",")) +
                    ")",
            new BeanListHandler<Record>(Record.class),
            ids.toArray(new Object[] {})
            ).stream()
            .collect(Collectors.toList());
} catch (SQLException e) {
    throw new DataAccessException(e.getMessage());
}
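
For reference, Record here is a plain bean. A minimal sketch of it - assuming the field names and types implied by the getters and setters used elsewhere in this post, not the actual class - would be:

public class Record {
    // Field names assumed from the setters used in Update 2 below; the post uses
    // target_id and record_id interchangeably for the same attribute, and port is
    // an int there (setPort(Integer.parseInt(...))).
    private String owner_id;
    private String record_id;   // corresponds to the type's target_id attribute
    private String ip;
    private String prefix;
    private int port;
    private String description;
    private String cost_id;

    public String getOwner_id() { return owner_id; }
    public void setOwner_id(String owner_id) { this.owner_id = owner_id; }
    public String getRecord_id() { return record_id; }
    public void setRecord_id(String record_id) { this.record_id = record_id; }
    public String getIp() { return ip; }
    public void setIp(String ip) { this.ip = ip; }
    public String getPrefix() { return prefix; }
    public void setPrefix(String prefix) { this.prefix = prefix; }
    public int getPort() { return port; }
    public void setPort(int port) { this.port = port; }
    public String getDescription() { return description; }
    public void setDescription(String description) { this.description = description; }
    public String getCost_id() { return cost_id; }
    public void setCost_id(String cost_id) { this.cost_id = cost_id; }
}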

Notes:

  • The final SELECT uses Apache Commons DbUtils - I love their bean stream operations.

  • The connection comes from a C3P0 connection pool - could that be related?

  • Just to make it clear - it's not that the bean processor populates null values into the Record bean fields; if I use an SQL explorer to load the table (or view) directly, I can see that the fields in the database are indeed set to NULL.

No SQLExceptions are thrown when the process runs, and there is no other indication that anything is wrong.

Any ideas what to check?

[Update] After reading about Oracle objects and SQLData mappings, I rewrote the code to use SQLData.

The Record class now implements SQLData, and its writeSQL() method looks like this:

@Override
public void writeSQL(SQLOutput stream) throws SQLException {
    stream.writeString(owner_id);
    stream.writeString(target_id);
    stream.writeString(Objects.isNull(ip) ? "0" : ip); // weird, but as specified
    stream.writeString(prefix);
    stream.writeString(String.valueOf(port));
    stream.writeString(description);
    stream.writeString(cost_id);
}
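
For completeness, SQLData also requires getSQLTypeName() and readSQL(). A minimal sketch of those two methods - reading the attributes back in the same order writeSQL() writes them, with the field names assumed from the rest of this post - could look like this:

@Override
public String getSQLTypeName() throws SQLException {
    return "SCHEME_ADM.RECORD_REC";
}

@Override
public void readSQL(SQLInput stream, String typeName) throws SQLException {
    // Attributes arrive in the order they are declared in RECORD_REC
    owner_id    = stream.readString();
    record_id   = stream.readString();   // the type's target_id attribute
    ip          = stream.readString();
    prefix      = stream.readString();
    String p    = stream.readString();
    port        = (p == null) ? 0 : Integer.parseInt(p);
    description = stream.readString();
    cost_id     = stream.readString();
}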

Then at the start of the calling code, I've added:

con.getTypeMap().put("SCHEME_ADM.RECORD_REC", Record.class);
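
One thing worth noting: the javadoc for Connection.getTypeMap() warns that a driver may keep an internal copy of the type map, so after changing it you may need to write it back with setTypeMap(). A sketch of the more defensive form:

// Some drivers keep an internal copy of the type map, so write it back after modifying it
java.util.Map<String, Class<?>> typeMap = con.getTypeMap();
typeMap.put("SCHEME_ADM.RECORD_REC", Record.class);
con.setTypeMap(typeMap);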

And instead of using createStruct(), the setObject() call now simply looks like this:

call.setObject("p_record_rec", t, Types.STRUCT)

But the result is the same - no errors, and all the passed values are read as NULL. I've traced through the writeSQL() implementation and I can see that it is called and all values are passed correctly into the Oracle code. I've also tried using Types.JAVA_OBJECT in the setObject() call, and got an error: Invalid column type.

[Update 2] Bordering on insane helplessness, I've implemented the OracleData pattern:

public class Record implements SQLData, OracleData, OracleDataFactory {
...
@Override
public Object toJDBCObject(Connection conn) throws SQLException {
    return conn.createStruct(getSQLTypeName(), new Object[] {
            Objects.isNull(owner_id) ? "" : owner_id,
            Objects.isNull(record_id) ? "" : record_id,
            Objects.isNull(ip) ? "0" : ip,
            Objects.isNull(prefix) ? "" : prefix,
            String.valueOf(port),
            Objects.isNull(description) ? "" : description,
            Objects.isNull(cost_id) ? "" : cost_id
    });
}

@Override
public OracleData create(Object jdbcValue, int sqltype) throws SQLException {
    if (Objects.isNull(jdbcValue)) return null;
    LinkedList<Object> attr = new LinkedList<>(Arrays.asList(((OracleStruct)jdbcValue).getAttributes()));
    Record r = new Record();
    r.setOwner_id(attr.removeFirst().toString());
    r.setRecord_id(attr.removeFirst().toString());
    r.setIp(attr.removeFirst().toString());
    r.setPrefix(attr.removeFirst().toString());
    r.setPort(Integer.parseInt(attr.removeFirst().toString()));
    r.setDescription(attr.removeFirst().toString());
    r.setCost_id(attr.removeFirst().toString());
    return r;
}

public static OracleDataFactory getOracleDataFactory() {
    return new Record();
}

Calling code:

...
// unwrap the Oracle object from C3P0 (standard JDBCv4 API)
OracleCallableStatement ops = call.unwrap(OracleCallableStatement.class);
// I'm not sure why I even need to do this - it looks exactly like
// the standard JDBC code
for (Record t : records) {
    ops.registerOutParameter(1, Types.VARCHAR);
    ops.setObject(2, t);
    ops.execute();
    ids.add(ops.getString(1));
}
...

And again, same result - no errors, and a record is created in the table with all the provided values set to NULL. I've traced through the code and the toJDBCObject() method is called correctly and does pass the values correctly into createStruct().

Found the problem. Annoyingly, it's about character encoding.

If, in the toJDBCObject() implementation, I run getAttributes() on the created struct, the resulting Object[] array has all fields set to "???", which is weird and looks like a character set transcoding failure (although it is weird even for that - there are three question marks for every field regardless of value length, including empty string values).
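
This is essentially the check I ran; a minimal sketch of it (inside toJDBCObject(), with the null handling from the real code left out for brevity):

@Override
public Object toJDBCObject(Connection conn) throws SQLException {
    Struct struct = conn.createStruct(getSQLTypeName(), new Object[] {
            owner_id, record_id, ip, prefix, String.valueOf(port), description, cost_id
    });
    // Diagnostic: dump the attributes the driver actually stored in the struct.
    // With the broken character set setup, every value came back as "???".
    for (Object attr : struct.getAttributes()) {
        System.out.println("attribute = " + attr);
    }
    return struct;
}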

According to Oracle's JDBC developer guide, "Globalization Support":

The basic Java Archive (JAR) file, ojdbc7.jar, contains all the necessary classes to provide complete globalization support for:

  • Oracle character sets for CHAR, VARCHAR, LONGVARCHAR, or CLOB data that is not being retrieved or inserted as a data member of an Oracle object or collection type.

  • CHAR or VARCHAR data members of object and collection for the character sets US7ASCII, WE8DEC, WE8ISO8859P1, WE8MSWIN1252, and UTF8.

To use any other character sets in CHAR or VARCHAR data members of objects or collections, you must include orai18n.jar in the CLASSPATH environment variable:

ORACLE_HOME/jlib/orai18n.jar

And my setup was using the character set "WE8ISO8859P9" (I have no idea why, what it means, or even whether it is selected by the client or the server - I just dumped the STRUCT object created by the OracleData API implementation and it was in there somewhere).

So when Oracle says that it does not "provide complete globalization support", they mean "all character fields will be silently converted to NULL". Hmpph.

Anyway, adding orai18n.jar to the CLASSPATH indeed fixed the problem, and now records are added correctly to the database.
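
As an aside, if you want to check which character set your database uses without dumping driver internals, querying nls_database_parameters is one way to do it. A sketch (assuming the connecting user can read that view):

// Query the database character set; if it is not one of the sets listed in the
// documentation excerpt above, orai18n.jar must be on the classpath.
try (Statement st = con.createStatement();
     ResultSet rs = st.executeQuery(
         "SELECT value FROM nls_database_parameters WHERE parameter = 'NLS_CHARACTERSET'")) {
    if (rs.next()) {
        System.out.println("Database character set: " + rs.getString(1));
    }
}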
