Abstract of Distributed constraint optimization related with soft arc consistency

Patricia Gutierrez Faxas

Distributed Constraint Optimization Problems (DCOPs) can be used to model many multi-agent coordination problems. A DCOP involves a finite number of agents, variables and cost functions, and the goal is to find a complete variable assignment with minimum global cost. The agents, each handling some of the variables, exchange information about their cost evaluations until an optimal solution is found. Recently, researchers have proposed several distributed algorithms to optimally solve DCOPs. In the centralized case, techniques have been developed to speed up constraint optimization solving. In particular, search can be improved by enforcing soft arc consistency, which identifies inconsistent values that can be removed from the problem. Among the soft arc consistency levels proposed are AC, FDAC and EDAC.
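As a minimal formal sketch of the objective just described (using generic DCOP notation that the abstract itself does not fix), the agents jointly look for

$$x^* = \operatorname*{argmin}_{x \in D_1 \times \cdots \times D_n} \; \sum_{f \in F} f(x_{\mathrm{scope}(f)}),$$

where each variable $x_i$ takes values in a finite domain $D_i$, each cost function $f \in F$ evaluates the variables in its scope, and each agent controls some of the variables and only knows the cost functions its variables participate in.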

The goal of this thesis is to include soft arc consistency techniques in DCOP resolution. We show that this combination yields substantial improvements in performance. Soft arc consistencies are conceptually the same in the centralized and distributed cases. However, maintaining them in the distributed case requires a different approach. While in the centralized case all problem elements are available to the single agent performing the search, in the distributed case each agent only knows part of the problem, and agents must exchange information to achieve the desired consistency level. In this process, the operations that modify the problem structures must be performed in such a way that each agent's partial view of the global problem remains coherent.

In this thesis we present three main contributions to optimal DCOP solving. First, we have studied and experimented with the complete solving algorithm BnB-ADOPT. As a result of this work, we have improved it to a large extent. We show that some of BnB-ADOPT's messages are redundant and can be removed without compromising optimality or termination; the resulting variant, BnB-ADOPT$^+$, is used as the base algorithm in the rest of the thesis. Also, when dealing with cost functions of arity higher than two, some issues appear in this algorithm; we propose a simple way to overcome them, obtaining a new version for the n-ary case. In addition, we present the new algorithm ADOPT($k$), which generalizes the algorithms ADOPT and BnB-ADOPT: depending on the $k$ parameter, ADOPT($k$) performs a search strategy like ADOPT, like BnB-ADOPT, or a hybrid of both.

Second, we have introduced soft arc consistency techniques in DCOPs, taking BnB-ADOPT$^+$ as our base solving algorithm. During the search process we enforce the soft arc consistency levels AC and FDAC, under the limitation that only unconditional deletions are propagated, obtaining important benefits in communication and computation. We also enforce FDAC considering multiple orderings of the variables, obtaining further savings in communication. In addition, we propose DAC by token passing, a new way to propagate deletions during distributed search; experimentally, this strategy turns out to be competitive with FDAC.
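For intuition about what enforcing AC involves, the sketch below shows the two basic operations on a binary weighted cost function in the centralized setting: projecting, for each value, the smallest binary cost onto the corresponding unary cost, and pruning values whose unary cost already reaches the current upper bound. The data layout and names are illustrative only, not taken from the thesis; in the distributed setting these operations have to be coordinated through message exchange between the agents owning the variables, as discussed above.

    # Illustrative soft arc consistency operations on a weighted CSP.
    # Names and data layout are hypothetical, chosen only for this sketch.

    def project(binary, unary_x, domain_x, domain_y):
        """Move, for each value a of x, the smallest binary cost c_xy(a, b)
        into the unary cost c_x(a); binary costs are lowered by the same
        amount, so the cost of every complete assignment is preserved."""
        for a in domain_x:
            alpha = min(binary[(a, b)] for b in domain_y)
            if alpha > 0:
                unary_x[a] += alpha
                for b in domain_y:
                    binary[(a, b)] -= alpha

    def prune(unary_x, domain_x, c0, top):
        """Keep only values whose unary cost plus the necessary global cost
        c0 stays below the upper bound `top`; the others cannot be part of
        any solution strictly better than `top`."""
        return [a for a in domain_x if c0 + unary_x[a] < top]

    # Tiny usage example with two values per variable.
    dx, dy = ["a1", "a2"], ["b1", "b2"]
    cxy = {("a1", "b1"): 2, ("a1", "b2"): 3, ("a2", "b1"): 0, ("a2", "b2"): 4}
    cx = {"a1": 0, "a2": 0}
    project(cxy, cx, dx, dy)         # cx becomes {"a1": 2, "a2": 0}
    dx = prune(cx, dx, c0=0, top=2)  # "a1" is deleted for upper bound 2
    print(dx)                        # ['a2']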

Third, we explore the inclusion of soft global constraints in DCOPs, which we believe enhances DCOP expressivity. We propose three different ways to include soft global constraints in DCOPs and extend the solving algorithm BnB-ADOPT$^+$ to support them. In addition, we explore the impact of soft arc consistency maintenance in problems with soft global constraints.

Experimentally, we measure the efficiency of the proposed algorithms on several benchmarks commonly used in the DCOP community.

