
New to Java


Column sums in a jagged 2d array

843789 — Feb 22 2010 (edited Feb 22 2010)
I'm trying to add up the values in each column of a 2D array, and I know how to do it when the array is rectangular. The problem comes when the array is jagged, specifically when the first row is the shortest. My outer column loop runs while `col < rowlength[0]`, so it stops at the length of the first row and the longer rows never get their extra columns summed.

The only way I've found to print the sums without an ArrayIndexOutOfBoundsException is to limit the loop to `rowlength[0]`. I've tried using if statements and even a third nested loop to skip a row once its last column is reached, but had no success... Perhaps I'm going about this the wrong way? Please help, I'm going insane :)
    int[][] data = { {3, 2, 5},
                     {1, 4, 4, 8, 13},
                     {9, 1, 0, 2},
                     {0, 2, 6, 3, -1, -8} };

    int[] rowlength = new int[data.length];
    int sum = 0;

    // Record the length of each row.
    for (int row = 0; row < data.length; row++) {
        rowlength[row] = data[row].length;
    }

    // Only iterates up to the length of the FIRST row,
    // so columns 3-5 of the longer rows are never summed.
    for (int col = 0; col < rowlength[0]; col++) {
        sum = 0;
        for (int row = 0; row < data.length; row++) {
            sum = sum + data[row][col];
        }
        System.out.println(sum);
    }
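One way to make this work for any jagged array (a sketch, not code from the original post): loop over columns up to the length of the *longest* row, and have the inner loop add a row's value only when that row is long enough to have the current column. Rows that are too short are simply skipped, so no bounds exception is possible.

```java
public class ColumnSums {
    public static void main(String[] args) {
        int[][] data = { {3, 2, 5},
                         {1, 4, 4, 8, 13},
                         {9, 1, 0, 2},
                         {0, 2, 6, 3, -1, -8} };

        // Find the length of the longest row so no column is missed.
        int maxCols = 0;
        for (int[] row : data) {
            maxCols = Math.max(maxCols, row.length);
        }

        for (int col = 0; col < maxCols; col++) {
            int sum = 0;
            for (int row = 0; row < data.length; row++) {
                // Only add a value if this row actually has column `col`.
                if (col < data[row].length) {
                    sum += data[row][col];
                }
            }
            System.out.println(sum);
        }
    }
}
```

With the sample data above this prints the six column sums 13, 9, 15, 13, 12, and -8; the last three columns draw on fewer rows because the shorter rows drop out of the sum.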
Post Details
Locked on Mar 22 2010
Added on Feb 22 2010
16 comments
2,447 views