Inserted JSON into Column is being Truncated (Oracle, 19.0.0.0.0, Linux)
Inserted JSON into Column is being Truncated [message #689127] Wed, 27 September 2023 16:26
Duane
Messages: 557
Registered: December 2002
Senior Member
I have some sample JSON data that I want to insert into a table.  I grabbed some sample JSON off the internet for testing.  I used a JSON Validator to ensure it's valid JSON.

The problem I'm having is the JSON is being truncated at the end.  This data is being truncated (:0,"limit":30}).  Not sure why.

Does anyone see what I'm doing wrong?


CREATE TABLE TRANSACTIONS_TEST
(
  ORDER_ID        NUMBER,
  ORDER_DOCUMENT  BLOB
)


declare
  
  blob_content blob;
  clob_content clob := '{"comments":[{"id":1,"body":"This is some awesome thinking!","postId":100,"user":{"id":63,"username":"eburras1q"}},{"id":2,"body":"What terrific math skills you’re showing!","postId":27,"user":{"id":71,"username":"omarsland1y"}},{"id":3,"body":"You are an amazing writer!","postId":61,"user":{"id":29,"username":"jissetts"}},{"id":4,"body":"Wow! You have improved so much!","postId":8,"user":{"id":19,"username":"bleveragei"}},{"id":5,"body":"Nice idea!","postId":62,"user":{"id":70,"username":"cmasurel1x"}},{"id":6,"body":"You are showing excellent understanding!","postId":19,"user":{"id":97,"username":"cdavydochkin2o"}},{"id":7,"body":"This is clear, concise, and complete!","postId":47,"user":{"id":22,"username":"froachel"}},{"id":8,"body":"What a powerful argument!","postId":47,"user":{"id":82,"username":"kogilvy29"}},{"id":9,"body":"I knew you could do it!","postId":64,"user":{"id":31,"username":"smargiottau"}},{"id":10,"body":"Wonderful ideas!","postId":4,"user":{"id":35,"username":"mbrooksbanky"}},{"id":11,"body":"It was a pleasure to grade this!","postId":2,"user":{"id":68,"username":"rstrettle1v"}},{"id":12,"body":"Keep up the incredible work!","postId":50,"user":{"id":77,"username":"rkingswood24"}},{"id":13,"body":"My goodness, how impressive!","postId":37,"user":{"id":28,"username":"xisherwoodr"}},{"id":14,"body":"You’re showing inventive ideas!","postId":30,"user":{"id":57,"username":"bpickering1k"}},{"id":15,"body":"You’ve shown so much growth!","postId":44,"user":{"id":76,"username":"cgaber23"}},{"id":16,"body":"Interesting thoughts!","postId":71,"user":{"id":100,"username":"pcumbes2r"}},{"id":17,"body":"I love your neat work!","postId":68,"user":{"id":37,"username":"nwytchard10"}},{"id":18,"body":"Doesn’t it feel good to do such great work?","postId":41,"user":{"id":31,"username":"smargiottau"}},{"id":19,"body":"First-rate work!","postId":75,"user":{"id":60,"username":"dlambarth1n"}},{"id":20,"body":"This is fascinating 
information!","postId":48,"user":{"id":17,"username":"vcholdcroftg"}},{"id":21,"body":"You inspire me!","postId":29,"user":{"id":5,"username":"kmeus4"}},{"id":22,"body":"This is right on target!","postId":18,"user":{"id":31,"username":"smargiottau"}},{"id":23,"body":"What an astounding observation!","postId":73,"user":{"id":14,"username":"mturleyd"}},{"id":24,"body":"This is very well thought out!","postId":32,"user":{"id":16,"username":"dpierrof"}},{"id":25,"body":"I can tell you’ve been practicing!","postId":44,"user":{"id":78,"username":"dbuist25"}},{"id":26,"body":"You’ve come a long way!","postId":70,"user":{"id":82,"username":"kogilvy29"}},{"id":27,"body":"I can tell you’ve been paying attention!","postId":60,"user":{"id":74,"username":"ahinckes21"}},{"id":28,"body":"Reading this made my day!","postId":85,"user":{"id":85,"username":"kpondjones2c"}},{"id":29,"body":"This is very perceptive!","postId":13,"user":{"id":30,"username":"kdulyt"}},{"id":30,"body":"What an accomplishment!","postId":23,"user":{"id":36,"username":"dalmondz"}}],"total":340,"skip":0,"limit":30}';

  
  begin
    insert into transactions_test
      (order_id,
       order_document)
       
     values
       (1000,
        empty_blob()) returning order_document into blob_content;
        
    dbms_lob.write(blob_content, length(clob_content), 1, utl_raw.cast_to_raw(clob_content));
       
  end;

This seems simple enough but I must be missing something.

[Updated on: Wed, 27 September 2023 19:05]


Re: Inserted JSON into Column is being Truncated [message #689131 is a reply to message #689127] Thu, 28 September 2023 00:57
Michel Cadot
Messages: 68647
Registered: March 2007
Location: Nanterre, France, http://...
Senior Member
Account Moderator

UTL_RAW works on VARCHAR2 and RAW datatypes, not CLOB and BLOB, so you are limited to 32K.
You can, for instance, split the CLOB into shorter chunks.
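A minimal sketch of that chunked approach, reusing the table and values from the original post (the 8000-character chunk size is an assumption chosen to keep each converted RAW piece safely under the 32767-byte limit even with a multibyte character set):

```
declare
  blob_content blob;
  clob_content clob;                           -- the JSON, however it arrives
  v_chunk_size constant pls_integer := 8000;   -- characters per chunk
  v_pos        pls_integer := 1;
  v_len        pls_integer;
  v_chunk      varchar2(32767);
begin
  insert into transactions_test
    (order_id, order_document)
  values
    (1000, empty_blob())
  returning order_document into blob_content;

  v_len := dbms_lob.getlength(clob_content);
  while v_pos <= v_len loop
    -- take the next slice of characters and append its raw bytes
    v_chunk := dbms_lob.substr(clob_content, v_chunk_size, v_pos);
    dbms_lob.writeappend(blob_content,
                         utl_raw.length(utl_raw.cast_to_raw(v_chunk)),
                         utl_raw.cast_to_raw(v_chunk));
    v_pos := v_pos + v_chunk_size;
  end loop;
end;
/
```

Note that this mixes character offsets (on the CLOB side) with byte lengths (on the BLOB side); DBMS_LOB.CONVERTTOBLOB, shown later in the thread, handles that conversion for you and is the cleaner option.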

Re: Inserted JSON into Column is being Truncated [message #689133 is a reply to message #689127] Thu, 28 September 2023 01:00
Barbara Boehmer
Messages: 9090
Registered: November 2002
Location: California, USA
Senior Member
If I try to run your code, I get a SQL*Plus error message on the line that begins "clob_content clob :=".  

If I run it on 12c it says, "SP2-0027: Input is too long (> 2499 characters) - line ignored."  

If I run it on 19c (EDIT: CORRECTION: 21c), it says, "SP2-0341: line overflow during variable substitution (>3000 characters at line 4)."

So, it seems it won't accept anything that big that way.  You could break it up into multiple smaller strings and use dbms_lob.writeappend.  However, if the goal is to load some copied and pasted data into a blob column, then I think it might be simpler to copy and paste the data into a file, then load the data into the blob column from the file.  Please see the demonstration below.

-- I copied and pasted your data into c:\my_oracle_files\test.dat

-- create oracle directory object if you don't have one already:
C##SCOTT@XE_21.3.0.0.0> CREATE OR REPLACE DIRECTORY MY_DIR AS 'c:\my_oracle_files'
  2  /

Directory created.
-- create table:
C##SCOTT@XE_21.3.0.0.0> CREATE TABLE TRANSACTIONS_TEST
  2  (
  3    ORDER_ID        NUMBER,
  4    ORDER_DOCUMENT  BLOB
  5  )
  6  /

Table created.
-- insert data:
C##SCOTT@XE_21.3.0.0.0> declare
  2    blob_content  blob;
  3    v_src_bfile   bfile  := bfilename ('MY_DIR', 'test.dat');
  4    v_dest_offset number := 1;
  5    v_src_offset  number := 1;
  6  begin
  7    insert into transactions_test
  8  	 ( order_id,
  9  	   order_document
 10  	 )
 11  	 values
 12  	   ( 1000,
 13  	     empty_blob()
 14  	   )
 15  	 returning order_document into blob_content;
 16    dbms_lob.fileopen  ( v_src_bfile, dbms_lob.file_readonly );
 17    dbms_lob.loadblobfromfile
 18  	 ( dest_lob    => blob_content
 19  	 , src_bfile   => v_src_bfile
 20  	 , amount      => dbms_lob.lobmaxsize
 21  	 , dest_offset => v_dest_offset
 22  	 , src_offset  => v_src_offset
 23  	 );
 24    dbms_lob.fileclose ( v_src_bfile );
 25  end;
 26  /

PL/SQL procedure successfully completed.
-- check results:
C##SCOTT@XE_21.3.0.0.0> SELECT order_id,
  2  	    TO_CLOB (order_document) order_document
  3  FROM   transactions_test
  4  /

  ORDER_ID                                                                      
----------                                                                      
ORDER_DOCUMENT                                                                  
--------------------------------------------------------------------------------
      1000                                                                      
{"comments":[{"id":1,"body":"This is some awesome thinking!","postId":100,"user"
:{"id":63,"username":"eburras1q"}},{"id":2,"body":"What terrific math skills you
’re showing!","postId":27,"user":{"id":71,"username":"omarsland1y"}},{"id":3,"bo
dy":"You are an amazing writer!","postId":61,"user":{"id":29,"username":"jissett
s"}},{"id":4,"body":"Wow! You have improved so much!","postId":8,"user":{"id":19
,"username":"bleveragei"}},{"id":5,"body":"Nice idea!","postId":62,"user":{"id":
70,"username":"cmasurel1x"}},{"id":6,"body":"You are showing excellent understan
ding!","postId":19,"user":{"id":97,"username":"cdavydochkin2o"}},{"id":7,"body":
"This is clear, concise, and complete!","postId":47,"user":{"id":22,"username":"
froachel"}},{"id":8,"body":"What a powerful argument!","postId":47,"user":{"id":
82,"username":"kogilvy29"}},{"id":9,"body":"I knew you could do it!","postId":64
,"user":{"id":31,"username":"smargiottau"}},{"id":10,"body":"Wonderful ideas!","
postId":4,"user":{"id":35,"username":"mbrooksbanky"}},{"id":11,"body":"It was a 
pleasure to grade this!","postId":2,"user":{"id":68,"username":"rstrettle1v"}},{
"id":12,"body":"Keep up the incredible work!","postId":50,"user":{"id":77,"usern
ame":"rkingswood24"}},{"id":13,"body":"My goodness, how impressive!","postId":37
,"user":{"id":28,"username":"xisherwoodr"}},{"id":14,"body":"You’re showing inve
ntive ideas!","postId":30,"user":{"id":57,"username":"bpickering1k"}},{"id":15,"
body":"You’ve shown so much growth!","postId":44,"user":{"id":76,"username":"cga
ber23"}},{"id":16,"body":"Interesting thoughts!","postId":71,"user":{"id":100,"u
sername":"pcumbes2r"}},{"id":17,"body":"I love your neat work!","postId":68,"use
r":{"id":37,"username":"nwytchard10"}},{"id":18,"body":"Doesn’t it feel good to 
do such great work?","postId":41,"user":{"id":31,"username":"smargiottau"}},{"id
":19,"body":"First-rate work!","postId":75,"user":{"id":60,"username":"dlambarth
1n"}},{"id":20,"body":"This is fascinating information!","postId":48,"user":{"id
":17,"username":"vcholdcroftg"}},{"id":21,"body":"You inspire me!","postId":29,"
user":{"id":5,"username":"kmeus4"}},{"id":22,"body":"This is right on target!","
postId":18,"user":{"id":31,"username":"smargiottau"}},{"id":23,"body":"What an a
stounding observation!","postId":73,"user":{"id":14,"username":"mturleyd"}},{"id
":24,"body":"This is very well thought out!","postId":32,"user":{"id":16,"userna
me":"dpierrof"}},{"id":25,"body":"I can tell you’ve been practicing!","postId":4
4,"user":{"id":78,"username":"dbuist25"}},{"id":26,"body":"You’ve come a long wa
y!","postId":70,"user":{"id":82,"username":"kogilvy29"}},{"id":27,"body":"I can 
tell you’ve been paying attention!","postId":60,"user":{"id":74,"username":"ahin
ckes21"}},{"id":28,"body":"Reading this made my day!","postId":85,"user":{"id":8
5,"username":"kpondjones2c"}},{"id":29,"body":"This is very perceptive!","postId
":13,"user":{"id":30,"username":"kdulyt"}},{"id":30,"body":"What an accomplishme
nt!","postId":23,"user":{"id":36,"username":"dalmondz"}}],"total":340,"skip":0,"
limit":30}';                                                                    
                                                                                

1 row selected.

 

[Updated on: Thu, 28 September 2023 09:45]


Re: Inserted JSON into Column is being Truncated [message #689141 is a reply to message #689133] Thu, 28 September 2023 06:57
Duane
Messages: 557
Registered: December 2002
Senior Member
Thanks Barbara.  The file method won't work for me.  I was doing it this way to make sure I could even do it.  The PROD version will be similar but the JSON data (CLOB variable) will be passed into a Package/Procedure and then inserted into the table.

It sounds like I will need to do it the Michel way of breaking up the CLOB into smaller chunks of data.  Actually, I think the JSON data being passed to us won't exceed 32K but I don't want to take that chance.  I would rather write the Package/Procedure to handle anything larger than 32K.

Very odd that the code produces errors when you run it.  No errors on my end except for the truncation of the data.  I wonder if it's because I copied/pasted the CLOB data and it's being formatted in an odd way.  On my screen, the CLOB data is all on one line, so there are no breaks in the data.  At least now I know I need to do something different to get this to work.
Re: Inserted JSON into Column is being Truncated [message #689143 is a reply to message #689141] Thu, 28 September 2023 09:45
John Watson
Messages: 8931
Registered: January 2010
Location: Global Village
Senior Member
Quote:
Actually, I think the JSON data being passed to us won't exceed 32K but I don't want to take that chance.  I would rather write the Package/Procedure to handle anything larger than 32K.
In that case, you could enable extended strings: set max_string_size=extended, and run utl32k.sql
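For reference, the sequence for enabling extended strings on a 19c non-CDB looks roughly like the following (a sketch based on the documented procedure; the steps differ for a CDB/PDB, it requires downtime, and the change is irreversible, so test it outside PROD first):

```
-- as SYSDBA:
SHUTDOWN IMMEDIATE;
STARTUP UPGRADE;
ALTER SYSTEM SET max_string_size = EXTENDED SCOPE = SPFILE;
@?/rdbms/admin/utl32k.sql
SHUTDOWN IMMEDIATE;
STARTUP;
```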
Re: Inserted JSON into Column is being Truncated [message #689144 is a reply to message #689143] Thu, 28 September 2023 10:40
Duane
Messages: 557
Registered: December 2002
Senior Member
John Watson wrote on Thu, 28 September 2023 14:45
Quote:
Actually, I think the JSON data being passed to us won't exceed 32K but I don't want to take that chance.  I would rather write the Package/Procedure to handle anything larger than 32K.
In that case, you could enable extended strings: set max_string_size=extended, and run utl32k.sql
I wouldn't be able to run it, and I doubt a DBA would want to run that on a PROD machine without it being tested by the software vendor.  You know what I mean: software vendors can decline support if changes have been made that aren't certified by them.

[Updated on: Thu, 28 September 2023 10:40]


Re: Inserted JSON into Column is being Truncated [message #689145 is a reply to message #689141] Thu, 28 September 2023 11:33
Barbara Boehmer
Messages: 9090
Registered: November 2002
Location: California, USA
Senior Member
The forum may alter things somewhat, so what I copied and pasted may not have been identical to what you used.

The output is because I had linesize set to 80, so that it all displays on this forum without having to scroll.

If you will be receiving the data as a clob passed to a procedure, then that makes things a lot easier.  In the following demonstration, I have created a clob variable g_clob and populated it with the data that you posted here, so that I can use that to simulate passing a clob to a procedure.  I have then created a procedure to accept that clob value, convert it from clob to blob and insert it into your table.  I then executed the procedure and displayed the results.  

The first part below is not anything that you will need to do.  It is just showing how I populated the g_clob variable with the data that I copied into the file.

C##SCOTT@XE_21.3.0.0.0> -- load test data from file test.dat into clob variable g_clob
C##SCOTT@XE_21.3.0.0.0> -- in order to have the data in a clob that can be passed to your code
C##SCOTT@XE_21.3.0.0.0> -- simulating how it would be passed to your procedure:
C##SCOTT@XE_21.3.0.0.0> VARIABLE g_clob CLOB
C##SCOTT@XE_21.3.0.0.0> declare
  2    v_dest_lob      clob;
  3    v_src_bfile     bfile  := bfilename ('MY_DIR', 'test.dat');
  4    v_dest_offset   integer := 1;
  5    v_src_offset    integer := 1;
  6    v_lang_context  number := dbms_lob.default_lang_ctx;
  7    v_warning       number;
  8  begin
  9    dbms_lob.createtemporary (v_dest_lob, true);
 10    dbms_lob.fileopen  ( v_src_bfile, dbms_lob.file_readonly );
 11    dbms_lob.loadclobfromfile
 12  	 ( dest_lob	=> v_dest_lob
 13  	 , src_bfile	=> v_src_bfile
 14  	 , amount	=> dbms_lob.lobmaxsize
 15  	 , dest_offset	=> v_dest_offset
 16  	 , src_offset	=> v_src_offset
 17  	 , bfile_csid	=> dbms_lob.default_csid
 18  	 , lang_context => v_lang_context
 19  	 , warning	=> v_warning
 20  	 );
 21    dbms_lob.fileclose ( v_src_bfile );
 22    :g_clob := v_dest_lob;
 23    dbms_lob.freetemporary (v_dest_lob);
 24  end;
 25  /

PL/SQL procedure successfully completed.

The part below is what you need.

C##SCOTT@XE_21.3.0.0.0> -- create table:
C##SCOTT@XE_21.3.0.0.0> CREATE TABLE TRANSACTIONS_TEST
  2  (
  3    ORDER_ID        NUMBER,
  4    ORDER_DOCUMENT  BLOB
  5  )
  6  /

Table created.

C##SCOTT@XE_21.3.0.0.0> -- create procedure:
C##SCOTT@XE_21.3.0.0.0> CREATE OR REPLACE PROCEDURE your_proc
  2    (p_clob	    IN CLOB)
  3  AS
  4    blob_content    BLOB;
  5    v_dest_offset   integer := 1;
  6    v_src_offset    integer := 1;
  7    v_lang_context  number := dbms_lob.default_lang_ctx;
  8    v_warning       number;
  9  BEGIN
 10    insert into transactions_test
 11  	 ( order_id,
 12  	   order_document
 13  	 )
 14  	 values
 15  	   ( 1000,
 16  	     empty_blob()
 17  	   )
 18  	 returning order_document into blob_content;
 19  	 dbms_lob.converttoblob
 20  	   ( dest_lob	  => blob_content
 21  	   , src_clob	  => p_clob
 22  	   , amount	  => dbms_lob.lobmaxsize
 23  	   , dest_offset  => v_dest_offset
 24  	   , src_offset   => v_src_offset
 25  	   , blob_csid	  => dbms_lob.default_csid
 26  	   , lang_context => v_lang_context
 27  	   , warning	  => v_warning);
 28  END your_proc;
 29  /

Procedure created.

C##SCOTT@XE_21.3.0.0.0> -- execute procedure passing clob variable g_clob that contains the data:
C##SCOTT@XE_21.3.0.0.0> EXEC your_proc (:g_clob)

PL/SQL procedure successfully completed.

C##SCOTT@XE_21.3.0.0.0> -- check results:
C##SCOTT@XE_21.3.0.0.0> SELECT order_id,
  2  	    TO_CLOB (order_document) order_document
  3  FROM   transactions_test
  4  /

  ORDER_ID                                                                      
----------                                                                      
ORDER_DOCUMENT                                                                  
--------------------------------------------------------------------------------
      1000                                                                      
{"comments":[{"id":1,"body":"This is some awesome thinking!","postId":100,"user"
:{"id":63,"username":"eburras1q"}},{"id":2,"body":"What terrific math skills you
’re showing!","postId":27,"user":{"id":71,"username":"omarsland1y"}},{"id":3,"bo
dy":"You are an amazing writer!","postId":61,"user":{"id":29,"username":"jissett
s"}},{"id":4,"body":"Wow! You have improved so much!","postId":8,"user":{"id":19
,"username":"bleveragei"}},{"id":5,"body":"Nice idea!","postId":62,"user":{"id":
70,"username":"cmasurel1x"}},{"id":6,"body":"You are showing excellent understan
ding!","postId":19,"user":{"id":97,"username":"cdavydochkin2o"}},{"id":7,"body":
"This is clear, concise, and complete!","postId":47,"user":{"id":22,"username":"
froachel"}},{"id":8,"body":"What a powerful argument!","postId":47,"user":{"id":
82,"username":"kogilvy29"}},{"id":9,"body":"I knew you could do it!","postId":64
,"user":{"id":31,"username":"smargiottau"}},{"id":10,"body":"Wonderful ideas!","
postId":4,"user":{"id":35,"username":"mbrooksbanky"}},{"id":11,"body":"It was a 
pleasure to grade this!","postId":2,"user":{"id":68,"username":"rstrettle1v"}},{
"id":12,"body":"Keep up the incredible work!","postId":50,"user":{"id":77,"usern
ame":"rkingswood24"}},{"id":13,"body":"My goodness, how impressive!","postId":37
,"user":{"id":28,"username":"xisherwoodr"}},{"id":14,"body":"You’re showing inve
ntive ideas!","postId":30,"user":{"id":57,"username":"bpickering1k"}},{"id":15,"
body":"You’ve shown so much growth!","postId":44,"user":{"id":76,"username":"cga
ber23"}},{"id":16,"body":"Interesting thoughts!","postId":71,"user":{"id":100,"u
sername":"pcumbes2r"}},{"id":17,"body":"I love your neat work!","postId":68,"use
r":{"id":37,"username":"nwytchard10"}},{"id":18,"body":"Doesn’t it feel good to 
do such great work?","postId":41,"user":{"id":31,"username":"smargiottau"}},{"id
":19,"body":"First-rate work!","postId":75,"user":{"id":60,"username":"dlambarth
1n"}},{"id":20,"body":"This is fascinating information!","postId":48,"user":{"id
":17,"username":"vcholdcroftg"}},{"id":21,"body":"You inspire me!","postId":29,"
user":{"id":5,"username":"kmeus4"}},{"id":22,"body":"This is right on target!","
postId":18,"user":{"id":31,"username":"smargiottau"}},{"id":23,"body":"What an a
stounding observation!","postId":73,"user":{"id":14,"username":"mturleyd"}},{"id
":24,"body":"This is very well thought out!","postId":32,"user":{"id":16,"userna
me":"dpierrof"}},{"id":25,"body":"I can tell you’ve been practicing!","postId":4
4,"user":{"id":78,"username":"dbuist25"}},{"id":26,"body":"You’ve come a long wa
y!","postId":70,"user":{"id":82,"username":"kogilvy29"}},{"id":27,"body":"I can 
tell you’ve been paying attention!","postId":60,"user":{"id":74,"username":"ahin
ckes21"}},{"id":28,"body":"Reading this made my day!","postId":85,"user":{"id":8
5,"username":"kpondjones2c"}},{"id":29,"body":"This is very perceptive!","postId
":13,"user":{"id":30,"username":"kdulyt"}},{"id":30,"body":"What an accomplishme
nt!","postId":23,"user":{"id":36,"username":"dalmondz"}}],"total":340,"skip":0,"
limit":30}';                                                                    
                                                                                

1 row selected.
Re: Inserted JSON into Column is being Truncated [message #689150 is a reply to message #689127] Thu, 28 September 2023 14:38
Duane
Messages: 557
Registered: December 2002
Senior Member
Thank you for the second part.  That's exactly what I was needing.  I was struggling to come up with the method Michel had mentioned.

Just so I understand: this method will work for really large CLOBs?  Let's say it's 1 MB of data.  No problem?

As I mentioned, I don't believe the JSON data passed to us will be large but I really don't know.
Re: Inserted JSON into Column is being Truncated [message #689151 is a reply to message #689150] Thu, 28 September 2023 15:06
Barbara Boehmer
Messages: 9090
Registered: November 2002
Location: California, USA
Senior Member
The problem with your original code, aside from the strange SQL*Plus errors that I got, was that it was trying to pass data in a manner that requires varchar2 or raw, which are subject to the following limits:

varchar2:
"Maximum size: 4000 bytes, or 32767 bytes if the MAX_STRING_SIZE initialization parameter is set to EXTENDED"

raw:
"Maximum size: 2000 bytes, or 32767 bytes if the MAX_STRING_SIZE initialization parameter is set to EXTENDED"

When you do

clob_content clob := 'something';

it expects that 'something' to be varchar2.

Similarly, when you do

utl_raw.cast_to_raw(clob_content)

it expects the result to be a raw value.


If you use clob or blob the limit is:

Maximum size: (4 GB - 1) * DB_BLOCK_SIZE initialization parameter (8 TB to 128 TB)


You can read the limits for various data types for your version 19c here:

https://docs.oracle.com/en/database/oracle/oracle-database/19/refrn/datatype-limits.html
Re: Inserted JSON into Column is being Truncated [message #689157 is a reply to message #689151] Thu, 28 September 2023 17:12
mathguy
Messages: 107
Registered: January 2023
Senior Member
So, just to be crystal clear:

CLOB can handle very large string values. If CLOB values are passed in (say, very large JSON), then there should be NO PROBLEM.

Your attempt fails for a pretty lame reason. The right-hand side of your assignment is a string literal (hard-coded, in single quotes). THAT is the issue. In Oracle there is no such thing as a CLOB literal. There are only "string literals" (or "text literals") which have some characteristics of VARCHAR2 and some of CHAR. In particular, the length limits apply (4000 bytes unless you have extended length limit, which you don't). It doesn't matter what the string will be assigned to (a CLOB in your case); the string literal itself is what is being rejected, before anything will be done with it.

Also, this should be a parse time error - the hard-coded string is already present when the parser sees the code, it doesn't depend on anything being seen at runtime.
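If a hard-coded value is really needed for testing, one common workaround is to split the literal into pieces that each stay under the 4000-byte limit and concatenate them onto a CLOB, roughly like this (a sketch; the elided middle pieces stand in for the rest of the JSON):

```
declare
  clob_content clob;
begin
  -- starting from TO_CLOB makes the whole concatenation a CLOB operation;
  -- each quoted piece is an ordinary sub-4000-byte string literal
  clob_content := to_clob('{"comments":[{"id":1,"body":"This is some awesome thinking!",')
               || '"postId":100,"user":{"id":63,"username":"eburras1q"}},'
               || '/* ... remaining pieces ... */'
               || '"total":340,"skip":0,"limit":30}';
end;
/
```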

If your attempt did not result in an error, but instead it just truncated the string (resulting very likely in an invalid JSON), that must be something your IDE did, before sending the query to the db server. Some IDE's have the (very bad in my opinion) habit of interfering between your code and the server you think you are sending your code to. Instead of sending EXACTLY what you typed - and letting the server complain if need be - they take it upon themselves to "fix" errors they think they found before sending it on. With results similar to what you reported.

What IDE are you using? Most people use SQL*Plus or SQL Developer; both should throw errors. (It's also possible that you are going through ApEx or some other higher-level product on top of the database; not sure how ApEx would react in such cases as I don't use it myself.)
Re: Inserted JSON into Column is being Truncated [message #689159 is a reply to message #689157] Thu, 28 September 2023 19:01
Duane
Messages: 557
Registered: December 2002
Senior Member
mathguy wrote on Thu, 28 September 2023 22:12

What IDE are you using?
I was using TOAD 16.2.  I believe 16.3 is available but I haven't upgraded yet.

No errors at all.  It actually works but truncates the very last part of the data (:0,"limit":30}), which makes it invalid JSON, as you pointed out.

I never even thought about the CLOB and how I was assigning data to it.  Makes sense.

As I mentioned, I was doing this as a test to ensure I could do it.  I just found some JSON data that seemed large and took it from there.  The actual code will be a Package/Procedure utilizing a CLOB for the JSON data being passed in.

Thanks for your help.