Welcome to OStack Knowledge Sharing Community for programmer and developer-Open, Learning and Share

0 votes
249 views
in Technique by (71.8m points)

python - Is there a way to impose a memory limit on the Google OR-Tools pywrapknapsack_solver?

I have created 5,000 knapsack problem instances, each consisting of 100 weights, 100 values, the total weight, and a capacity percentage. The weights and values are drawn from a normal distribution with mean 100 and standard deviation 5. Each input file is a pickle formatted as [tensor(weights), tensor(values), tensor(total weight), capacity percentage].
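
For reference, here is a minimal sketch of how one such instance might be generated and pickled. The rounding to integers is my assumption (the OR-Tools knapsack solver works on integer weights and values), and the file name is just a placeholder.

import pickle
import torch

num_items = 100
# Weights and values ~ Normal(mean=100, std=5), rounded and cast to integers
# because the OR-Tools knapsack solver expects integer inputs (assumption:
# the real dataset is generated similarly).
weights = torch.normal(mean=100.0, std=5.0, size=(num_items,)).round().long()
values = torch.normal(mean=100.0, std=5.0, size=(num_items,)).round().long()
total_weight = weights.sum()
capacity_percentage = 50  # e.g. capacity is 50% of the total weight

with open('instance_0.pkl', 'wb') as f:  # placeholder path
    pickle.dump([weights, values, total_weight, capacity_percentage], f)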

I have written a Python program that takes in the input file, uses the Google OR-Tools pywrapknapsack_solver to solve the problem, and outputs the solution as a one-hot vector along with the solution time.

import torch 
from ortools.algorithms import pywrapknapsack_solver
import time
import pickle
import argparse

def translate_soln(packed_items, num_items):
    # Convert the solver's list of packed item indices into a one-hot vector.
    out = [0] * num_items
    for i in packed_items:
        out[int(i)] = 1
    return out


def knapsack_solution_generator(input_path, output_path):
    with open(input_path, 'rb') as f:
        input_data = pickle.load(f)

    weights = input_data[0].tolist()
    values = input_data[1].tolist()

    total_weight = input_data[2].item()
    capacity_percentage = input_data[3]
    # The solver takes one capacity per weight dimension, hence the list.
    capacity = [total_weight * (capacity_percentage / 100)]

    num_items = len(weights)

    solver = pywrapknapsack_solver.KnapsackSolver(
            pywrapknapsack_solver.KnapsackSolver.
            KNAPSACK_MULTIDIMENSION_BRANCH_AND_BOUND_SOLVER, 'KnapsackExample')
    solver.set_time_limit(7200)  # time limit in seconds (2 hours)
    
    start_time = time.perf_counter()
    solver.Init(values, [weights], capacity)  # weights wrapped in a list: one inner list per dimension
    
    computed_value = solver.Solve()
    end_time = time.perf_counter()

    solution_time = round(end_time - start_time, 5)

    packed_items = []
    for j in range(num_items):
        if solver.BestSolutionContains(j):
            packed_items.append(j)

    solution = translate_soln(packed_items, num_items)
    with open(output_path, 'wb') as f:
        pickle.dump((solution, solution_time), f)
    

if __name__ == "__main__":
    parser = argparse.ArgumentParser()

    parser.add_argument('--input', type=str, help="Input file path")
    parser.add_argument('--output', type=str, help="Output file path")
    
    args = parser.parse_args()
    input_path = args.input
    output_path = args.output

    knapsack_solution_generator(input_path, output_path)
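
Each instance is solved independently by invoking the script once per file. Assuming the script is saved as knapsack_solution_generator.py (my placeholder name, as are the paths), the call looks like:

python knapsack_solution_generator.py --input instance_0.pkl --output solution_0.pkl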

I have been using a Linux virtual machine to solve all of these problems independently. However, when the machine attempts to solve problems with capacity percentages in the 10-90% range, certain problem instances will not solve: the output log shows a "Killed" error. To dig deeper, I had the machine attempt those instances again while monitoring with the Linux top command, and saw that the problems that would not solve were killed at 80% memory usage after about 20 minutes, whereas other problems would be solved within the 2-hour limit. What is causing the solver to use up so much memory? Is there a way to impose a memory constraint?
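
The only memory constraint I know how to apply is at the OS level rather than through the solver: capping the process's address space with Python's standard resource module on Linux, so the run fails fast instead of being OOM-killed much later. This is not a pywrapknapsack_solver option as far as I can tell; a minimal sketch, where the 8 GiB cap is just an example value to tune:

import resource

# Cap this process's virtual address space (Linux only). Once the solver
# exceeds the cap, allocations fail immediately instead of the kernel
# OOM-killing the process after it has already run for a long time.
memory_limit_bytes = 8 * 1024 ** 3  # assumed cap; tune to the VM's RAM
soft, hard = resource.getrlimit(resource.RLIMIT_AS)
resource.setrlimit(resource.RLIMIT_AS, (memory_limit_bytes, hard))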



1 Answer

0 votes
by (71.8m points)
Waiting for an expert to reply.

