Getting a summation of results

Hello Everyone,
I have a working query and I am using a pivot. What I want to do is add another column that will sum my columns, but I am having trouble. Here is my query.
SELECT *
FROM (
    select upper(a.doc_type) as DOC_TYPE,
           to_char(a.date_time, 'yyyy-mm-dd') as DATES
    from tableA a
    where exists (
        select 1
        from tableB b
        where b.filename = 'ABCXYZ'
        and to_char(a.id) = b.id
    )
)
pivot (
    count(*) for (DATES) in ( '2021-01-01', '2021-01-02', '2021-01-03' )
)
ORDER BY 1;
This is the result I get from the query.
DOC_TYPE | '2021-01-01' | '2021-01-02' | '2021-01-03'
TYPE_1   |   1          |   3          |  2
TYPE_2   |  10          |  20          |  5
TYPE_3   | 100          | 500          | 60
I would like to add the sum at the end and it should be a summation of 2021-01-01 through 2021-01-03.
Desired results would look like...
DOC_TYPE | '2021-01-01' | '2021-01-02' | '2021-01-03' | SUM_TOTAL
TYPE_1   |   1          |   3          |  2           |   6
TYPE_2   |  10          |  20          |  5           |  35
TYPE_3   | 100          | 500          | 60           | 660
I tried adding this to my query, but it gave me an error of "missing right parenthesis".
... pivot (
    count(*) for (DATES) in ( '2021-01-01', '2021-01-02', '2021-01-03' )
    sum( q'['2021-01-01']' + q'['2021-01-02']' + q'['2021-01-03']' )
) ...

Also tried ...

... pivot (
    count(*) for (DATES) in ( '2021-01-01', '2021-01-02', '2021-01-03' )
    sum('''2021-01-01''' + '''2021-01-02''' + '''2021-01-03''')
) ...
I am assuming I am referencing the column wrong, but I am not sure how to reference it. Thanks in advance for all the help.
Best Answer
-
Hi,
As Mathguy said, the column names are not 10 characters long, starting with 2 (e.g., "2021-01-01"); they are 12 characters long, starting with single-quote (e.g., "'2021-01-01'"). Another problem is that those column names cannot be used inside the PIVOT clause; you can use them in the SELECT clause.
I suggest using names that don't require double-quotes, such as d_2021_01_01. If you must use non-standard names, then do something like this:
WITH data_to_pivot AS (
    select upper(a.doc_type) as DOC_TYPE
         , to_char(a.date_time, 'yyyy-mm-dd') as DATES
    from tableA a
    where exists (
        select 1
        from tableB b
        where b.filename = 'ABCXYZ'
        and to_char(a.id) = b.id
    )
)
SELECT p.*
     , "'2021-01-01'" + "'2021-01-02'" + "'2021-01-03'" AS total
FROM data_to_pivot
PIVOT (
    COUNT(*)
    FOR (dates) IN ( '2021-01-01', '2021-01-02', '2021-01-03' )
) p
ORDER BY doc_type;
If you'd care to post CREATE TABLE and INSERT statements for some sample data, then I could test this.
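In the meantime, here is a minimal self-contained sketch of the same pattern that anyone can run to see it work. The table name demo_docs and the sample rows are invented purely for illustration; adapt them to your real schema:

```sql
-- Hypothetical sample data for testing the pivot-plus-total pattern
CREATE TABLE demo_docs ( doc_type VARCHAR2(20), date_time DATE );

INSERT INTO demo_docs VALUES ('type_1', DATE '2021-01-01');
INSERT INTO demo_docs VALUES ('type_1', DATE '2021-01-02');
INSERT INTO demo_docs VALUES ('type_2', DATE '2021-01-02');

-- PIVOT generates columns named 'YYYY-MM-DD' (single-quotes included),
-- so the outer SELECT must double-quote the whole name, quotes and all.
WITH data_to_pivot AS (
    SELECT UPPER(doc_type) AS doc_type
         , TO_CHAR(date_time, 'yyyy-mm-dd') AS dates
    FROM   demo_docs
)
SELECT p.*
     , "'2021-01-01'" + "'2021-01-02'" AS total
FROM   data_to_pivot
PIVOT  ( COUNT(*) FOR (dates) IN ( '2021-01-01', '2021-01-02' ) ) p
ORDER  BY doc_type;
```

Note that COUNT(*) in a pivot produces 0 (not null) for missing combinations, so no NVL is needed in this particular case.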
Answers
-
To get the sum of numbers in a ROW, you just use the + operator with the column names as operands; don't use SUM(...).
Then: weird column names must be enclosed in DOUBLE-QUOTES, like so:
"2021-01-08"
(those are not two consecutive single-quotes, but a single double-quote character on each side of the column name).
Beware of null, too; unlike the SUM(...) aggregate (which works on columns, adding values from all rows, the opposite of what you need here), the simple addition operator does not ignore null; instead, it returns null as the answer. If you want null to be treated as zero, you will need to wrap each column name (double-quoted as shown) in NVL(..., 0) before applying the addition operator +.
In the end, the formula for the extra column might look something like:
nvl("2021-01-01", 0) + nvl("2021-01-02", 0) + ... as total
or without the NVL if you don't expect null, or if you don't want null to be treated as zero
"2021-01-01" + "2021-01-02" + .... as total
EDIT:
Looking at it again, you are simply letting PIVOT create the column names. So they aren't simply 2021-01-01, for example; they are already enclosed in single-quotes. If that works for you, fine; then the column names are referenced like so:
"'2021-01-01'"
Happily the code formatting makes it easy to read this. There are double-quote characters at the extremities; then single-quote characters which are part of the column names (as generated by PIVOT).
The alternative is to generate column names without single-quotes. You can do that in PIVOT:
pivot ... for dates in ('2021-01-01' as "2021-01-01", .....)
then PIVOT will use those column names instead of the automatically generated ones. Note that here, too, you must use double-quotes around the weird column names.
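Putting the aliasing idea together with the row-wise addition, the whole query might look like this (an untested sketch, using the table and column names from the original question):

```sql
-- Alias each pivoted value so the generated column names contain no
-- single-quotes, then add the double-quoted names in the outer SELECT.
SELECT p.*
     , "2021-01-01" + "2021-01-02" + "2021-01-03" AS sum_total
FROM   ( SELECT UPPER(a.doc_type) AS doc_type
              , TO_CHAR(a.date_time, 'yyyy-mm-dd') AS dates
         FROM   tableA a
         WHERE  EXISTS ( SELECT 1
                         FROM   tableB b
                         WHERE  b.filename = 'ABCXYZ'
                         AND    TO_CHAR(a.id) = b.id ) )
PIVOT  ( COUNT(*)
         FOR (dates) IN ( '2021-01-01' AS "2021-01-01"
                        , '2021-01-02' AS "2021-01-02"
                        , '2021-01-03' AS "2021-01-03" ) ) p
ORDER  BY doc_type;
```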
-
Thanks Mathguy.
I tried the double quotes, but it still produces the same error message (missing right parenthesis). I am not too worried about nulls, because the count will produce 0 if there are no rows for that day.
pivot (
    count(*) for (DATES) in ( '2021-01-01', '2021-01-02', '2021-01-03' )
    "2021-01-01" + "2021-01-02" + "2021-01-03"
)
I think it has to do with my pivot and the "dynamic" (not sure if that is the right word for it) date columns ('2021-01-01', '2021-01-02', '2021-01-03').
-
I didn't see it the first time, so I may have already been in the thread before the edit. Thanks again for the help though. I did use the double quotes and got rid of the single quote in the column name like you mentioned. I still have the wrong syntax for the "sum" column.
-
Thanks Frank! Your query did the trick. It looks like I was putting the SUM part in the wrong spot in the query to begin with.
-
Lol, I was looking so closely at the trees, I didn't see the problem with the forest.
I saw the SUM(...) thing, and the way you were trying to reference the column names, but I didn't notice where you were doing all these things. As Mr. Kulash pointed out, you can only do that in the SELECT clause, not in PIVOT.
-
Lol, no worries! I was wondering why you were focused on that part, but I figured there must be some type of lesson you were trying to teach me. Thanks again for the help.
-
Hi, @User_OMEF8
This illustrates one of the many reasons why you need to include a little sample data whenever you have a question, so the people who want to help you can test their ideas. You may have several separate problems with a piece of code; you want them all fixed, not just the most obvious one.
-
That is a very valid point. I will try to provide those the next time I post a question. Thanks again for your help. I greatly appreciate it.