There is a wooden stick of length L >= 1 that we need to cut into pieces. The cutting positions are given in an int array A, guaranteed to be in ascending order and within the range [1, L - 1]. The cost of each cut equals the length of the stick segment being cut. Determine the minimum total cost to cut the stick at all the given positions.
Example
- L = 10, A = {2, 4, 7}: the minimum total cost is 10 + 4 + 6 = 20 (cut at 4 first, which costs 10; then cut the left piece [0, 4] at 2, costing 4, and cut the right piece [4, 10] at 7, costing 6).
Solution: interval DP, the "blossom from the center" pattern (中心开花): for each interval, enumerate which cut inside it is made first.
Pad the cut positions with the two ends of the stick, [0, 2, 4, 7, 10] in the example; the answer is M[0][4].
M[i][j]: the minimum cost of fully cutting the segment between padded positions i and j.
Base case: M[i][i + 1] = 0, since adjacent positions enclose a piece with no cut inside it.
Induction rule: start from size = 2 and grow the interval size; for each interval (i, i + size), enumerate the first cut j with i < j < i + size and compute temp = end - start + M[i][j] + M[j][i + size], where end - start is the length of the current segment (the cost of making the first cut in it);
M[i][i + size] = min(M[i][i + size], temp). Pay attention to the mapping between DP indices and positions in the original cuts array.
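To make that index relationship concrete, here is a minimal sketch of the mapping from a DP index k in [0, cuts.length + 1] to its physical position on the stick (the helper name idxToPosition is an assumption for illustration, not part of the solution; the code below inlines the same mapping when computing start and end):

// Illustration only: map a padded DP index k to a physical position on the stick.
// The helper name idxToPosition is assumed, not part of the original solution.
private static int idxToPosition(int[] cuts, int length, int k) {
    if (k == 0) {
        return 0;                 // left end of the stick
    } else if (k == cuts.length + 1) {
        return length;            // right end of the stick
    } else {
        return cuts[k - 1];       // the k-th cut position (1-based over cuts)
    }
}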
public int minCost(int[] cuts, int length) {
    // cost[i][j]: min cost to fully cut the segment between padded positions i and j.
    // Padded position k maps to 0 (k == 0), length (k == cuts.length + 1),
    // or cuts[k - 1] otherwise. The base case cost[i][i + 1] = 0 is the array default.
    int[][] cost = new int[cuts.length + 2][cuts.length + 2];
    for (int size = 2; size < cost.length; size++) {
        for (int i = 0; i < cost.length - size; i++) {
            // physical endpoints of the current segment
            int start = i == 0 ? 0 : cuts[i - 1];
            int end = i + size == cuts.length + 1 ? length : cuts[i + size - 1];
            cost[i][i + size] = Integer.MAX_VALUE;
            for (int j = i + 1; j < i + size; j++) {
                // cut at padded position j first: pay the segment length, then solve both halves
                int temp = end - start + cost[i][j] + cost[j][i + size];
                cost[i][i + size] = Math.min(cost[i][i + size], temp);
            }
        }
    }
    return cost[0][cost.length - 1];
}
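A quick sanity check on the example above, assuming minCost is placed in a class named Main (the wrapper class and main method are added here only to show the call):

public class Main {
    // minCost from above goes here

    public static void main(String[] args) {
        // Example from above: L = 10, cuts at {2, 4, 7}, expected minimum cost 20
        System.out.println(new Main().minCost(new int[]{2, 4, 7}, 10)); // prints 20
    }
}

The three nested loops over the padded indices give O(n^3) time and O(n^2) space for n = cuts.length.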