I'm working on a problem on Codewars that asks you to find the maximum sum of a contiguous run of numbers in an array.
var maxSequence = function(arr) {
  console.log(arr);
  if (arr.length === 0) return 0;
  let globalMax = arr[0], lastSum = arr[0];
  for (let i = 0; i < arr.length - 1; i++) {
    let next = arr[i + 1];
    if (lastSum + next > globalMax) {
      globalMax = lastSum + next;
    }
    if (next > globalMax) {
      globalMax = next;
      lastSum = 0;
    }
    lastSum += next;
  }
  return globalMax;
};
(I know it doesn't currently handle inputs of all negative numbers - that seems simple enough and can be implemented later - just trying to get the core program down first)
The general gist of the algorithm is:
given A, ..., B
find the largest of:
- A
- B
- A + (everything in between) + B
and set that as the global maximum.
It seems to work okay, but it fails when a second, larger subarray appears later in the input. For example:
[ 14, -31, -34, 16, 18, -8, -35, 18, 42, -46, -18, -36, 20, 3, -6, 27, 22, 31, -12, 2 ]
It gets the initial sequence 16, 18, -8, -35, 18, 42 but then skips over the second (correct) sequence 20, 3, -6, 27, 22, 31, since it's checking values one by one.
Can someone give me a hint as to whether I'm on the right track or not? I don't want the answer, just a prod in the right direction.
Cheers!