Journal of Artificial Intelligence Research 80 (2024) 931-976. Submitted 01/2024; published 07/2024.

Unifying SAT-Based Approaches to Maximum Satisfiability Solving

Hannes Ihalainen (hannes.ihalainen@helsinki.fi)
Jeremias Berg (jeremias.berg@helsinki.fi)
Matti Järvisalo (matti.jarvisalo@helsinki.fi)
Department of Computer Science, University of Helsinki, Finland

Abstract

Maximum satisfiability (MaxSAT), employing propositional logic as the declarative language of choice, has turned into a viable approach to solving NP-hard optimization problems arising from artificial intelligence and other real-world settings. A key contributing factor to the success of MaxSAT is the rise of increasingly effective exact solvers that are based on iterative calls to a Boolean satisfiability (SAT) solver. The three types of SAT-based MaxSAT solving approaches, each with its distinguishing features, implemented in current state-of-the-art MaxSAT solvers are the core-guided, the implicit hitting set (IHS), and the objective-bounding approaches. The objective-bounding approach is based on directly searching over the objective function range by iteratively querying a SAT solver on whether the MaxSAT instance at hand has a solution under different bounds on the objective. In contrast, both core-guided and IHS are so-called unsatisfiability-based approaches that employ a SAT solver as an unsatisfiable-core extractor to determine sources of inconsistencies, but critically differ in how the found unsatisfiable cores are made use of towards finding a provably optimal solution. Furthermore, a variety of different algorithmic variants of the core-guided approach in particular have been proposed and implemented in solvers.
It is well acknowledged that each of the three approaches has its advantages and disadvantages, which is also witnessed by instance- and problem-domain-specific runtime performance differences (and at times similarities) of MaxSAT solvers implementing variants of the approaches. However, the questions of to what extent the approaches are fundamentally different and how the benefits of the individual methods could be combined in a single algorithmic approach are currently not fully understood. In this work, we approach these questions by developing UniMaxSAT, a general unifying algorithmic framework. Based on the recent notion of abstract cores, UniMaxSAT captures in general core-guided, IHS and objective-bounding computations. The framework offers a unified way of establishing quite generally the correctness of the current approaches. We illustrate this by formally showing that UniMaxSAT can simulate the computations of various algorithmic instantiations of the three types of MaxSAT solving approaches. Furthermore, UniMaxSAT can be instantiated in novel ways, giving rise to new algorithmic variants of the approaches. We illustrate this aspect by developing a prototype implementation of an algorithmic variant for MaxSAT based on the framework.

©2024 The Authors. Published by AI Access Foundation under Creative Commons Attribution License CC BY 4.0.

1. Introduction

The declarative paradigm of maximum satisfiability (MaxSAT) (Li & Manyà, 2021; Bacchus et al., 2021) is today a viable approach to solving NP-hard optimization problems arising from AI and other real-world settings. Much of the success of MaxSAT is due to advances in practical algorithms for MaxSAT and their fine-grained implementations as MaxSAT solvers. MaxSAT solvers can be categorized into exact (or complete) and inexact (or incomplete) solvers.
Inexact solvers are typically based on stochastic local search (Jiang et al., 1995; Cai et al., 2014; Lei & Cai, 2018; Chu et al., 2023) and/or combinations of techniques from exact solvers (Joshi et al., 2019; Berg et al., 2019; Nadel, 2020), and are in general geared towards finding good solutions in relatively short time instead of providing guarantees on finding provably optimal solutions. In contrast, exact solvers are guaranteed to find optimal solutions, given enough runtime resources. A majority of research on developing increasingly effective MaxSAT solvers has to date focused on exact solvers, which are also the focus of this work.

Exact MaxSAT solvers are based on one or more types of algorithmic approaches. As the main focus of this work, a great majority of modern exact MaxSAT solvers are SAT-based (Bacchus et al., 2021), implementing Boolean satisfiability (SAT) based MaxSAT algorithms that make use of SAT solvers as real-world NP-oracles in an iterative fashion (Morgado et al., 2013; Ansótegui et al., 2013). Apart from the mainstream SAT-based MaxSAT solvers, we note that branch-and-bound based MaxSAT solvers, most recently integrating specific search techniques from SAT solving such as clause learning, have also been developed (Planes, 2003; Li et al., 2005, 2006; Heras et al., 2008; Abramé & Habet, 2014, 2016; Li et al., 2021, 2022; Li & Manyà, 2021).
The underlying algorithmic approaches implemented in the various SAT-based MaxSAT solvers today can be categorized into three types: the so-called core-guided approach (Fu & Malik, 2006; Marques-Silva & Planes, 2007; Heras et al., 2011; Ansótegui et al., 2013; Morgado et al., 2013, 2014; Narodytska & Bacchus, 2014; Alviano et al., 2015; Ansótegui et al., 2016; Ansótegui & Gabàs, 2017), the implicit hitting set (IHS) approach (Davies & Bacchus, 2011, 2013; Saikko et al., 2016), and the objective-bounding approach (Fu & Malik, 2006; Eén & Sörensson, 2006; Berre & Parrain, 2010; Koshimura et al., 2012; Heras et al., 2011; Ignatiev et al., 2014). The objective-bounding approach is based on directly searching over the objective function range by iteratively querying a SAT solver on whether the MaxSAT instance at hand has a solution under different bounds on the objective, using different strategies such as model-improving search (Eén & Sörensson, 2006; Berre & Parrain, 2010; Koshimura et al., 2012), binary search (Fu & Malik, 2006; Heras et al., 2011; Piotrów, 2020) or progression-based search (Ignatiev et al., 2014). In contrast, both core-guided and IHS are unsatisfiability-based approaches, relying on iteratively extracting sources of inconsistencies in terms of unsatisfiable cores using a SAT solver (Eén & Sörensson, 2003) as a core-extracting decision oracle. However, core-guided and IHS solvers deal with the cores extracted during search differently. Core-guided algorithms reformulate the current working instance (starting with the input MaxSAT instance) to take into account the so-far extracted cores in subsequent search iterations towards an optimal solution. The various core-guided algorithms differ in the way in which the reformulation steps change the working instance. In contrast, in each iteration of IHS search, the SAT solver is invoked on a subset of clauses of the input instance, without reformulation-style modifications to the instance.
The choice of the subset of constraints to consider in each iteration is dictated by computing a (minimum-cost) hitting set of the so-far accumulated set of cores.

In practice, state-of-the-art core-guided, IHS and objective-bounding (especially model-improving) MaxSAT solvers are all competitive in terms of runtime performance. However, the relative performance on distinct problem domains can vary noticeably between solvers implementing a specific approach (Bacchus et al., 2019). The fundamental reasons behind this are not well understood, despite recent advances showing that, for a specific classic variant of core-guided search, the cores extracted from the reformulated working formulas during core-guided search are tightly related to cores extracted in IHS search on the original instance (Bacchus & Narodytska, 2014; Narodytska & Bjørner, 2022). Furthermore, fundamental insights into how to combine the best of each of the three types of algorithmic approaches are currently lacking.

In this work, we develop a general algorithmic framework that captures core-guided, IHS, and objective-bounding computations in a unifying way. The framework is based on the recently-proposed notion of abstract cores, originally presented as a performance-improving variant of IHS for MaxSAT (Berg et al., 2020), which brings a flavor of core-guided reformulation into the representation of the hitting set problems solved during IHS search. While the correctness of the objective-bounding approach is relatively straightforward to establish directly, this is not generally the case for fine-grained variants of the core-guided approach, which would also translate to non-trivial individual correctness proofs for any non-trivial combinations of, e.g., the core-guided and IHS approaches. Our framework provides a unified way of establishing the correctness of variants of core-guided and IHS approaches.
The framework also has the potential of being instantiated in novel ways, thereby giving rise to new variants of provably-correct MaxSAT algorithms. While the main focus of this work is evidently on the general formal algorithmic framework, as an illustration of its potential for obtaining novel types of unsatisfiability-based algorithms we briefly outline and provide a prototype implementation of a core-guided variant for MaxSAT obtained through the framework.

In terms of related work, the motivations underlying the UniMaxSAT framework are in part similar to generic frameworks developed for capturing reasoning performed by modern SAT solvers and closely related solver technologies. These include the inprocessing rules framework (Järvisalo et al., 2012; Fazekas et al., 2019) for capturing the various types of reasoning steps applied by inprocessing SAT solvers, as well as the DPLL(T) framework (Nieuwenhuis et al., 2006) and its extensions, which have been developed for fine-grained formalization of satisfiability modulo theories (SMT) solvers (Cimatti et al., 2010; Barrett et al., 2021; Bjørner & Fazekas, 2023), specific types of optimization approaches in SMT (Fazekas et al., 2018), as well as, e.g., reasoning performed by answer set programming (ASP) solvers (Baselice et al., 2005; Gebser et al., 2009). Both the inprocessing and the DPLL(T) framework take the view of formalizing solving steps as transition systems, describing the possible next-state transitions from a current solver state. The inprocessing framework is instantiated by specific redundancy notions, which have more recently been generalized to the realm of MaxSAT (Ihalainen et al., 2022) and which themselves can generally cover the various reasoning techniques applied in SAT solvers. However, such redundancy notions on the MaxSAT level do not allow for directly capturing the multitude of SAT-based MaxSAT algorithms.
In contrast, a key motivation behind the UniMaxSAT framework is to cover all of the three main SAT-based MaxSAT solving approaches, also with the aim of providing a unifying view towards novel types of MaxSAT algorithms that would combine aspects of the different approaches in correct ways.

A shorter preliminary version of this work was presented at the IJCAI 2023 conference (Ihalainen et al., 2023). The present article considerably revises and extends the work reported at IJCAI 2023. Most notably, we here thoroughly revise the details of the UniMaxSAT framework. As a result, the framework now allows for capturing not only core-guided and implicit hitting set approaches to MaxSAT, but more generally SAT-based MaxSAT solving, including what we will refer to as objective-bounding MaxSAT algorithms, instantiations of which include the model-improving approach as well as binary and progression-based search for MaxSAT. The revised framework is also arguably cleaner in terms of being more directly connected with how the various SAT-based MaxSAT algorithms perform search. As a result, the formal proofs have been thoroughly revised and further details included, including formal explanations of how the general framework captures several further algorithmic instantiations of SAT-based MaxSAT approaches.

The rest of this article is organized as follows. We start with background on maximum satisfiability (Section 2) and an overview of the objective-bounding, core-guided, and implicit hitting set approaches to MaxSAT solving (Section 3). Then, as the main contribution of this work, we detail the UniMaxSAT framework (Section 4) and argue that it can simulate the behavior of the objective-bounding and implicit hitting set approaches (Section 5) as well as the various variants of the core-guided approaches proposed in the literature so far (Section 6).
Before conclusions, we further detail a novel variant of the core-guided approach to illustrate the use of the UniMaxSAT framework to formulate new algorithmic variants (Section 7).

2. Maximum Satisfiability

For a Boolean variable x there are two literals, x and ¬x. A clause C = l1 ∨ … ∨ ln is a disjunction of literals, and a conjunctive normal form (CNF) formula is a set of clauses F = {C1, …, Cm}. For a clause C, the set var(C) consists of the variables x for which either x ∈ C or ¬x ∈ C. An assignment τ maps variables to 1 (true) or 0 (false). Assignments extend to a literal l, clause C and formula F in the standard way by τ(¬l) = 1 − τ(l), τ(C) = max{τ(l) | l ∈ C} and τ(F) = min{τ(C) | C ∈ F}. An assignment τ satisfies (or is a solution to) F if τ(F) = 1. Interchangeably, we may treat τ as the set of literals τ assigns to 1. Then l ∈ τ denotes τ(l) = 1 and ¬l ∈ τ denotes τ(l) = 0. The set of variables assigned by τ is var(τ) = {x | x ∈ τ or ¬x ∈ τ}; τ is complete for F if it assigns each variable in F a value, and otherwise partial. An assignment τ that assigns each literal in a clause C to 0 falsifies C, denoted by τ(C) = 0.

Pseudo-Boolean constraints are linear inequalities of the form Σ_i c_i·x_i ≥ k (and analogously with ≤), where each x_i is a Boolean variable, each c_i a positive coefficient, and k a positive constant. The constraint Σ_i c_i·x_i ≥ k is satisfied by an assignment τ if Σ_i c_i·τ(x_i) ≥ k. When we do not make assumptions about how exactly pseudo-Boolean constraints are represented as CNF formulas [1], we abstractly use asCNF(Σ_i c_i·x_i ≥ k) to denote a CNF formula that is satisfied by an assignment τ iff Σ_i c_i·τ(x_i) ≥ k. Taking a name o_k to indicate whether a
pseudo-Boolean constraint is satisfied, we also use asCNF(Σ_i c_i·x_i ≥ k ≡ o_k) to denote a (CNF representation of a) reified pseudo-Boolean constraint, i.e., a CNF formula that is satisfied by any assignment τ that sets τ(o_k) = 1 iff Σ_i c_i·τ(x_i) ≥ k. An important special case of pseudo-Boolean constraints are the so-called cardinality constraints Σ_i x_i ≥ k, which are pseudo-Boolean constraints in which each coefficient is 1. Notice how the o_k variable of a reified cardinality constraint asCNF(Σ_i x_i ≥ k ≡ o_k) essentially indicates whether the number of x_i variables assigned to 1 reaches k.

An instance F = (F, O) of (weighted partial) maximum satisfiability (MaxSAT for short) consists of a CNF formula F and an objective function O = Σ_i w_i·b_i + W_lb under minimization, where the w_i are positive integers and the b_i are variables of F. Notice that we here include for convenience the constant term W_lb. This allows for explicitly representing lower bounds on costs of solutions as computed by core-guided MaxSAT algorithms (as detailed in Section 3.2).

Remark 1. The definition of MaxSAT in terms of a CNF formula and an objective that we use in this work is equivalent to the arguably more classical (clausal) definition of MaxSAT in terms of hard and weighted soft clauses in the following sense. Going from the objective function representation to the clausal representation, the clauses remain hard clauses, and each term w_i·b_i in the objective function O is equivalently represented as a soft clause ⟨(¬b_i), w_i⟩, i.e., a unit soft clause (¬b_i) with weight w_i.

[1] Various CNF encodings of pseudo-Boolean constraints have been proposed (Warners, 1998; Bailleux & Boufkhad, 2003; Sinz, 2005; Eén & Sörensson, 2006; Bailleux et al., 2009; Codish & Zazon-Ivry, 2010;
To the other direction, any clausal instance of MaxSAT can be converted to an instance where each soft clause is a unit clause by the blocking variable transformation (Bacchus et al., 2021) standardly employed in SAT-based MaxSAT solvers before search: introduce a fresh variable b_i for each non-unit soft clause C_i with weight w_i and replace C_i with the hard clause C_i ∨ b_i and the soft clause ⟨(¬b_i), w_i⟩. After this transformation, the introduced soft unit clauses are evidently equivalent to the objective function Σ_i w_i·b_i.

For example, consider the following (clausal) MaxSAT instance (FH, FS) consisting of the hard clauses FH = {(¬x ∨ b1), (y ∨ z ∨ b2)} and the soft clauses FS = {⟨(¬y ∨ x), 1⟩, ⟨(¬b1), 2⟩, ⟨(¬b2), 5⟩, ⟨(¬z ∨ x ∨ b1), 3⟩}. Applying the blocking variable transformation to each non-unit soft clause results in the set of hard clauses F_H^b = {(¬x ∨ b1), (y ∨ z ∨ b2), (¬y ∨ x ∨ b3), (¬z ∨ x ∨ b1 ∨ b4)} and the set of soft clauses F_S^b = {⟨(¬b1), 2⟩, ⟨(¬b2), 5⟩, ⟨(¬b3), 1⟩, ⟨(¬b4), 3⟩}. This instance can be equivalently represented using the objective function representation as the instance F = (F_H^b, O) with O = 2·b1 + 5·b2 + b3 + 3·b4.

The set var(O) consists of the variables that occur in O. A complete satisfying assignment τ to F is a solution to F and has cost O(τ) = Σ_i w_i·τ(b_i) + W_lb. A solution is optimal if there are no solutions of lower cost. The cost of optimal solutions to a MaxSAT instance F is denoted by opt(F).

Example 1. Consider the MaxSAT instance F = (F, O) with F = {(b1 ∨ b2 ∨ x), (¬x ∨ b3), (b3 ∨ b4 ∨ b5)} and O = b1 + b2 + 3·b3 + b4 + 2·b5. An optimal solution to F is τ = {¬b1, b2, ¬x, ¬b3, b4, ¬b5}, assigning all variables except b2, b4 to 0. The cost of τ is O(τ) = τ(b1) + τ(b2) + 3·τ(b3) + τ(b4) + 2·τ(b5) = 2.

[1, continued] Asín et al., 2011; Hölldobler et al., 2012; Abío et al., 2013; Ogawa et al., 2013; Manthey et al., 2014; Joshi et al., 2015; Paxian et al., 2018; Karpinski & Piotrów, 2019).
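The cost computation of Example 1 and the claim that τ is optimal can be checked by brute-force enumeration. A minimal sketch (the clauses and weights mirror Example 1; `satisfies`, `cost` and `opt_cost` are hypothetical helper names, not part of any solver):

```python
from itertools import product

# Instance of Example 1:
# F = {(b1 v b2 v x), (-x v b3), (b3 v b4 v b5)}, O = b1 + b2 + 3*b3 + b4 + 2*b5.
WEIGHTS = {"b1": 1, "b2": 1, "b3": 3, "b4": 1, "b5": 2}
VARS = ("b1", "b2", "b3", "b4", "b5", "x")

def satisfies(tau):
    # Checks the three clauses of F under the 0/1 assignment tau.
    return bool((tau["b1"] or tau["b2"] or tau["x"])
                and (not tau["x"] or tau["b3"])
                and (tau["b3"] or tau["b4"] or tau["b5"]))

def cost(tau):
    # O(tau): sum of the weights of the objective variables assigned 1
    # (constant term W_lb = 0 here).
    return sum(w for v, w in WEIGHTS.items() if tau[v])

def opt_cost():
    # Brute-force opt(F) over all 2^6 complete assignments.
    return min(cost(dict(zip(VARS, vals)))
               for vals in product((0, 1), repeat=len(VARS))
               if satisfies(dict(zip(VARS, vals))))
```

The solution of Example 1 (b2 = b4 = 1, all other variables 0) satisfies F with cost 2, and the enumeration confirms that no cheaper solution exists.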
Algorithm 1: The objective-bounding search approach to MaxSAT
Input: A MaxSAT instance F = (F, O) where O = Σ_i w_i·b_i.
Output: An optimal solution τ* to F.
2: while true do
3:   w = Next-value-to-test()
4:   (res, ·, τ) = Extract-Core(F ∧ asCNF(Σ_i w_i·b_i ≤ w ≡ o_w), {o_w})
5:   if res = true and O(τ) < O(τ*) then τ* = τ
6:   if res = false and w = O(τ*) − 1 then return τ*

A clause C is a(n unsatisfiable) core of a MaxSAT instance F = (F, O) if all literals in C are objective variables (i.e., var(C) ⊆ var(O)) and every solution to F satisfies C (i.e., F logically entails C).

Example 2. The clauses (b1 ∨ b2 ∨ b3) and (b3 ∨ b4 ∨ b5) are two of the cores of the MaxSAT instance detailed in Example 1.

3. SAT-Based Approaches to MaxSAT

We develop a unifying algorithmic framework for modern SAT-based algorithms, capturing forms of objective-bounding search, core-guided algorithms, and algorithms based on the implicit hitting set (IHS) approach. As necessary background, we describe each of these approaches in general terms; practical solver implementations employ various heuristics and optimizations that do not affect our main contributions and, as such, are not detailed here.

Common to the three types of modern SAT-based MaxSAT algorithms is the use of an incremental SAT solver that can determine the satisfiability of CNF formulas under different sets of assumptions (Eén & Sörensson, 2003). Given a CNF formula F and a partial assignment γA (constituting a set of assumptions, represented as a set of literals), we abstract the SAT solver into the subroutine Extract-Core that returns a triplet (res, C, τ). Here res = true if there is a solution τ ⊇ γA to F. If there is no such solution, res = false and C is a clause over a subset of the variables in γA that is entailed by F. Invoked on F under a set of assumptions γA for which F ∧ γA is unsatisfiable, modern SAT solvers provide such a clause C at termination without computational overhead.
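The Extract-Core abstraction can be sketched with a toy enumeration-based stand-in. This is not how solvers implement it (a real implementation uses an incremental SAT solver with the assumptions interface of Eén & Sörensson, 2003); `extract_core` is a hypothetical helper, and the clause it returns is simply the disjunction of the negated assumptions rather than a minimized core:

```python
from itertools import product

def extract_core(clauses, variables, assumptions):
    """Toy stand-in for the Extract-Core oracle.

    clauses: list of clauses, each a set of (variable, polarity) literals.
    assumptions: set of (variable, polarity) literals that must hold.
    Returns (res, core, tau): res = True with a satisfying assignment tau
    extending the assumptions, or res = False with a clause over the
    assumption variables that is entailed by the clauses.
    """
    for vals in product((0, 1), repeat=len(variables)):
        tau = dict(zip(variables, vals))
        if all(tau[v] == p for v, p in assumptions):
            if all(any(tau[v] == p for v, p in cl) for cl in clauses):
                return True, None, tau
    # Unsatisfiable under the assumptions: the disjunction of the negated
    # assumptions is entailed by the clauses.
    core = {(v, 1 - p) for v, p in assumptions}
    return False, core, None
```

On the formula of Example 1 with assumptions forcing all objective variables to 0, this stand-in reports unsatisfiability and yields the core (b1 ∨ b2 ∨ b3 ∨ b4 ∨ b5).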
In the context of SAT-based MaxSAT algorithms, such a C found during MaxSAT search will be an unsatisfiable core of the current working instance.

3.1 Objective-Bounding Search

The so-called objective-bounding search algorithms, captured by Algorithm 1, compute an optimal solution to a given MaxSAT instance (F, O) by iteratively selecting a value w (Line 3) and querying Extract-Core for a solution τ to F satisfying O(τ) ≤ w (Line 4). In practice, the query is formed by encoding a pseudo-Boolean constraint enforcing the bound w on the objective O. The solution of lowest cost found so far is stored in τ* and updated whenever Extract-Core returns res = true and a new solution (Line 5). The search terminates when the algorithm establishes that there is no solution τ for which O(τ) ≤ O(τ*) − 1 (Line 6).

The different objective-bounding search algorithms differ mainly in the order in which the different value choices for w are selected, which is also reflected in the termination criterion. The main approaches proposed in the literature are the so-called solution-improving (sometimes called SAT-UNSAT) search (Eén & Sörensson, 2006; Berre & Parrain, 2010; Koshimura et al., 2012; Paxian et al., 2018), UNSAT-SAT (lower-bounding) search, binary search (Fu & Malik, 2006; Heras et al., 2011; Piotrów, 2020) and progression-based search (Ignatiev et al., 2014). Solution-improving search is an upper-bounding approach that, starting from some upper bound such as the sum of all coefficients of the objective, in each iteration sets w to be one lower than the cost of the best currently known solution τ*. This strategy guarantees that the algorithm terminates when Extract-Core returns res = false. Solution-improving search is today the most widely employed objective-bounding search algorithm. In contrast, UNSAT-SAT search is a lower-bounding approach that starts from w = 0 and increments w each time Extract-Core returns res = false.
The search terminates when Extract-Core returns res = true. [2] Binary search algorithms perform binary search over the range of the objective function, maintaining both an upper and a lower bound on the optimal cost. The upper bound is updated whenever Extract-Core returns res = true, and the lower bound whenever Extract-Core returns res = false. [3] Finally, progression-based search can be seen as a combination of UNSAT-SAT and binary search. Progression-based search initially tests the values w = (2^0 − 1, 2^1 − 1, 2^2 − 1, …) until Extract-Core reports res = true and a solution on a particular ith iteration. At this stage, the algorithm has determined that the optimal cost is between 2^(i−1) − 1 and 2^i − 1 and switches to binary search using these as the initial lower and upper bounds.

[2] Today, UNSAT-SAT search is not commonly used. This is mainly because core-guided algorithms can be seen as refined versions of UNSAT-SAT search and generally outperform UNSAT-SAT search in practice.

3.2 Core-Guided Search

Turning to core-guided search, we outline as Algorithm 2 a general abstraction of the core-guided approach to computing an optimal solution to a given MaxSAT instance F = (F, O). The algorithm first initializes a set Constraints of cardinality constraints as the empty set and a reformulated objective function OR as the objective function O (Lines 1-2). In each iteration of the main loop (Lines 3-9) a SAT solver is queried for a solution τ that (i) satisfies all clauses in F and all of the cardinality constraints in Constraints and (ii) falsifies all objective variables of the current reformulated objective OR, i.e., OR(τ) = W_lb (Lines 4-5). If there is such a τ, it is returned as an optimal solution to the input MaxSAT instance (Line 6). Otherwise, a core C of (F ∧ Constraints, OR) is obtained. The core is then relaxed (Lines 7-9) by transforming the current working instance in a way
that enables (at most) one variable in core C to incur cost in subsequent iterations. This is achieved by adding a cardinality constraint over the core to Constraints (Lines 7-8) and updating the current working objective (Line 9).

[3] In theory, binary search has the desirable property of guaranteed termination within a logarithmic number of calls to Extract-Core in terms of the sum of the objective coefficients. In practice, however, it is commonly acknowledged by MaxSAT solver developers that implementations of binary search are often outperformed by implementations of solution-improving search. This is due to the fact that the intermediate calls to Extract-Core that report res = false can often be challenging when solving real-world instances.

Algorithm 2: The core-guided approach to MaxSAT
Input: A MaxSAT instance F = (F, O).
Output: An optimal solution τ to F.
1: Constraints = ∅
2: OR = O
3: while true do
4:   γA = {¬x | x ∈ var(OR)}
5:   (res, C, τ) = Extract-Core(F ∧ Constraints, γA)
6:   if res = true then return τ
7:   (D, out) = Generate-Cardinality-Constraints(C)
8:   Constraints = Constraints ∪ D
9:   OR = Refine-Objective(C, out, OR)

Conceptually, modern core-guided algorithms differ mainly in the specifics of the core-relaxation step. We detail the relaxation of the core-guided OLL algorithm (Andres et al., 2012; Morgado et al., 2014) as arguably one of the most successful core-guided approaches. In OLL, an invocation of Generate-Cardinality-Constraints(C) returns a set of cardinality constraints D = {asCNF(Σ_{x∈C} x ≥ j ≡ o^C_j) | 2 ≤ j ≤ |C|} and a set out = {o^C_2, …, o^C_|C|} of output variables. Intuitively, as enforcing o^C_k to 0 limits the number of literals in C assigned to 1 to at most k − 1, the new cardinality constraints define output variables that count the number of literals in C assigned to 1 in subsequent iterations.
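The counting semantics of the OLL outputs can be sketched as follows, assuming (as above) that o^C_j is true iff at least j literals of the core are assigned 1. Here `oll_outputs` is a hypothetical helper that evaluates the outputs for a given assignment rather than encoding the cardinality constraints in CNF:

```python
def oll_outputs(core_literal_values):
    """Evaluate the OLL outputs o_2..o_|C| for a core C, given the 0/1 values
    of the literals of C: o_j = 1 iff at least j literals are assigned 1, so
    forcing o_j to 0 caps the number of satisfied core literals at j - 1."""
    n_true = sum(core_literal_values)
    return {j: int(n_true >= j) for j in range(2, len(core_literal_values) + 1)}
```

For a five-literal core with exactly one literal satisfied, every output stays 0; with three literals satisfied, o_2 and o_3 become 1 while o_4 and o_5 stay 0, matching the "one extra cost per extra satisfied literal" intuition behind the objective refinement below.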
The output variable with index 1 is not introduced, since the fact that C is a core implies that every solution to the instance assigns at least one of its literals to 1. In the objective reformulation step (the Refine-Objective procedure in Algorithm 2) OLL adds the newly-introduced outputs to the objective in a way that preserves the set of optimal solutions. The coefficient of each x ∈ C is decreased by w_C = min_{x∈C}{OR(x)}, removing from OR every literal whose coefficient decreases to 0. The coefficient of each output variable in out is set to w_C and the constant term of OR is increased by w_C. During the reformulation step, the coefficient of at least one variable in C decreases to 0. Thus, at least one more literal may incur cost in subsequent iterations.

Example 3. Invoke OLL on F = (F, O) from Example 1. The first call to Extract-Core is under the assumptions γA = {¬b1, ¬b2, ¬b3, ¬b4, ¬b5}. Let the first core obtained be C1 = (b1 ∨ b2 ∨ b3 ∨ b4 ∨ b5). Relaxing C1 introduces the cardinality constraints asCNF(Σ_{x∈C1} x ≥ i ≡ o^1_i) for i = 2, 3, 4, 5. The new objective OR is formed based on the following observations: (i) as C1 is a core, any solution to F assigns at least one literal in C1 to 1 and as such incurs cost 1 in O, and (ii) each additional literal of C1 assigned to 1 should incur precisely 1 more cost. The new objective is OR = 2·b3 + b5 + o^1_2 + o^1_3 + o^1_4 + o^1_5 + 1. Notice how observation (i) results in the addition of the constant term 1 and observation (ii) in the addition of the outputs of the new cardinality constraints to the objective.

Algorithm 3: The implicit hitting set approach to MaxSAT
Input: A MaxSAT instance F = (F, O).
Output: An optimal solution τ to F.
1: K = ∅
2: while true do
3:   γA = {¬x | x ∈ var(O) \ Mincost-HS(K)}
4:   (res, C, τ) = Extract-Core(F, γA)
5:   if res = true then return τ
6:   else K = K ∪ {C}

The next call to Extract-Core is under the assumptions γA = {¬b3, ¬b5, ¬o^1_2, ¬o^1_3, ¬o^1_4, ¬o^1_5}.
Let the next core obtained be C2 = (o^1_2 ∨ b3). Relaxing C2 introduces the cardinality constraint asCNF(Σ_{x∈C2} x ≥ 2 ≡ o^2_2) and the new objective OR = b3 + b5 + o^1_3 + o^1_4 + o^1_5 + o^2_2 + 2. The set of assumptions in the third call to Extract-Core is γA = {¬b3, ¬b5, ¬o^1_3, ¬o^1_4, ¬o^1_5, ¬o^2_2}. One potentially obtained assignment is now τ = {¬b1, b2, ¬x, ¬b3, b4, ¬b5} that also assigns {o^1_2, ¬o^1_3, ¬o^1_4, ¬o^1_5, ¬o^2_2}. The assignment is returned as an optimal solution to the input instance F.

3.3 Implicit Hitting Set Approach

A generic abstraction of the implicit hitting set (IHS) approach to MaxSAT is outlined as Algorithm 3. IHS iteratively extracts cores of a given MaxSAT instance F = (F, O) and stores them in the set K. In contrast to core-guided algorithms, instead of reformulating the objective after each core-extraction step, IHS invokes the Mincost-HS(K) procedure that computes a minimum-cost hitting set (MCHS) over K under O. Here an MCHS is a minimum-cost (in terms of O) subset hs of the objective variables such that assigning the variables in hs to 1 satisfies all cores in K. In each iteration of the main loop (Lines 2-6), Extract-Core is queried for a solution that falsifies all objective variables that are not contained in the hs computed over the current set of cores (Lines 3-4). (Note that the assumptions γA set up on Line 3 constitute a partial assignment over the objective variables that can be extended to a solution to K in a unique way.) If there is such a τ, it is an optimal solution to the input instance (Line 5). Otherwise, a new core is obtained and added to K (Line 6). The MCHS computed in each iteration represents a way of satisfying all cores found so far in an optimal way under O, thus giving a lower bound on the optimal cost of F. IHS iterates until the most recent MCHS can be extended to a solution to F.
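The Mincost-HS step can be sketched by brute force over subsets of the objective variables. Practical IHS solvers instead delegate this NP-hard problem to an integer programming solver; `mincost_hs` below is a hypothetical helper for small instances only:

```python
from itertools import combinations

def mincost_hs(cores, weights):
    """Brute-force minimum-cost hitting set: the cheapest (w.r.t. weights)
    set of objective variables that intersects every core. Each core is a
    set of variable names; weights maps variable names to positive costs."""
    variables = sorted(weights)
    best, best_cost = None, float("inf")
    for r in range(len(variables) + 1):
        for subset in combinations(variables, r):
            chosen = set(subset)
            if all(chosen & core for core in cores):  # hits every core
                c = sum(weights[v] for v in chosen)
                if c < best_cost:
                    best, best_cost = chosen, c
    return best, best_cost
```

On the cores of the instance from Example 1 this reproduces the hitting-set costs seen during IHS search: a single core over b1..b5 is hit at cost 1, and adding the cores (b3 ∨ b4 ∨ b5) and (b1 ∨ b2 ∨ b3) raises the minimum hitting cost to 2.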
At that point, the solution satisfies ("hits") all cores of the instance, not only those currently accumulated in K, and is thereby an optimal solution to F.

Example 4. Invoke Algorithm 3 on the MaxSAT instance F = (F, O) from Example 1. In the first iteration there are no cores and hence Mincost-HS(K) = ∅. The first call to Extract-Core is under the assumptions γA = {¬b1, …, ¬b5}. There are a number of cores that could be returned; let the first core obtained be C1 = (b1 ∨ b2 ∨ b3 ∨ b4 ∨ b5). In the second iteration, there are three different MCHSes over K = {C1}. Assume that Mincost-HS returns {b1}. The assumptions for the next call to Extract-Core are γA = {¬b2, ¬b3, ¬b4, ¬b5}. Assume that the next core obtained is C2 = (b3 ∨ b4 ∨ b5). In the third iteration, the only MCHS over K = {C1, C2} is {b4}. The assumptions for the next call to Extract-Core are γA = {¬b1, ¬b2, ¬b3, ¬b5}. Assume that the next core is C3 = (b1 ∨ b2 ∨ b3). In the fourth iteration, there are two possible MCHSes over K = {C1, C2, C3}. Assume that Mincost-HS returns {b2, b4}. This leads to the assumptions γA = {¬b1, ¬b3, ¬b5}. Given these assumptions, Extract-Core returns the solution τ = {¬b1, b2, ¬x, ¬b3, b4, ¬b5} as an optimal solution to F.

4. UniMaxSAT: A General Framework for SAT-Based MaxSAT Algorithms

As the main contribution of this article, we present UniMaxSAT as a general algorithmic framework unifying SAT-based MaxSAT algorithms. The framework makes use of the notion of abstract cores, originally proposed as a basis for a refinement of IHS (Berg et al., 2020). Here, going considerably beyond their original intended purpose, we build on abstract cores to obtain a framework that captures SAT-based MaxSAT algorithms in general terms.

4.1 Abstraction Sets and Abstract Cores

We start by defining abstraction sets and abstract cores.
On a high level, abstraction sets and abstract cores of a MaxSAT instance capture generic properties of the instance compactly, in the sense that a large number of standard cores would be needed to express the same properties (Berg et al., 2020). Informally speaking, an abstraction set models a relationship between a set in of input literals and a set out of output literals via a CNF formula D. In typical practical instantiations, the formula D corresponds to a cardinality constraint that essentially counts the number of input literals assigned to 1 by satisfying assignments of D, assigning the kth output literal to 1 if and only if at least k input literals are assigned to 1. In the following, we give a more general definition that is sufficient for proving the correctness of UniMaxSAT.

Definition 1. An abstraction set ab = (in, D, out) consists of a set in of input literals, a set out of output literals, and a satisfiable CNF formula D over a superset of the variables of in ∪ out, i.e., it holds that var(in ∪ out) ⊆ var(D). Solutions to D are uniquely defined by assignments to the inputs: for any assignment τ over in there is exactly one extension τE ⊇ τ that satisfies D.

For a given abstraction set ab = (in, D, out), we refer to D as the definitions of the outputs out. For a collection AB = {(in_i, D_i, out_i) | i = 1, …, n} of abstraction sets, the CNF formula DEF(AB) = ∪_{i=1}^n D_i is the conjunction of the definitions in AB, OUTS(AB) = ∪_{i=1}^n out_i is the set of outputs occurring in AB, and INPUTS(AB) = ∪_{i=1}^n in_i is the set of inputs occurring in AB. We say that AB is feasible for a MaxSAT instance F = (F, O) if DEF(AB) does not change the set of solutions to F, i.e., if every solution τ to F can be extended to a solution τE ⊇ τ to F ∧ DEF(AB). We will only consider collections of abstraction sets that are feasible for the MaxSAT instances at hand.
An abstract core of a MaxSAT instance F = (F, O) is a clause that is logically entailed by F and the definitions of some feasible collection of abstraction sets. Importantly, an abstract core can contain both objective variables and outputs of abstraction sets.

Definition 2. For a MaxSAT instance F = (F, O) and a collection AB of feasible abstraction sets, a clause C is an abstract core of F wrt AB if (i) var(C) ⊆ var(O) ∪ var(OUTS(AB)) and (ii) τ(C) = 1 for each solution τ to F ∧ DEF(AB).

Every (standard) core of a MaxSAT instance F is also an abstract core of F with respect to any collection of feasible abstraction sets.

Example 5. Consider the MaxSAT instance F = (F, O) from Example 1 and the abstraction set ab = ({b1, b2, b3, b4, b5}, {CNF(∑_{1≤i≤5} b_i ≥ j ↔ o_j) | j = 2, 3, 4, 5}, {o2, o3, o4, o5}). We have that C = (o2 ∨ b3) is an abstract core of F, as any assignment that satisfies F ∧ DEF(ab) must assign either b3 = 1 or at least two variables of {b1, b2, b3, b4, b5} to 1, thereby forcing o2 = 1.

Note how the abstract core C in Example 5 corresponds to the core C2 in Example 3. This demonstrates how cores of the reformulated instance extracted by OLL can be viewed as abstract cores of the original instance.4 The UniMaxSAT framework is based on computing minimum-cost solutions to abstract cores and extending them to a solution to the MaxSAT instance at hand. To differentiate solutions to an input MaxSAT instance from solutions to cores, we call solutions to a set of abstract cores candidate solutions (or candidates for short). More precisely, for a MaxSAT instance F = (F, O), a collection AB of feasible abstraction sets and a set K of abstract cores, an assignment δ that satisfies K ∧ DEF(AB) and assigns each variable in var(O) is a (K, AB) candidate of F and has cost O(δ). A (K, AB) candidate δ is minimum-cost if O(δ) ≤ O(δ′) for all (K, AB) candidates δ′.
Abstraction sets and abstract cores are employed in the UniMaxSAT framework for computing lower bounds (which are in turn used to prove the optimality of solutions) based on the following proposition.

Proposition 1. Let F = (F, O) be a MaxSAT instance, AB a set of feasible abstraction sets, K a set of abstract cores of F wrt AB, and δ a minimum-cost (K, AB) candidate. Then O(δ) ≤ opt(F).

Proof. Consider an arbitrary solution τ to F. By feasibility of AB there is an extension δ_τ ⊇ τ which is a solution to F ∧ DEF(AB) and for which O(δ_τ) = O(τ). By the definition of abstract cores, δ_τ is a (K, AB) candidate. Since δ is minimum-cost, we have that O(τ) = O(δ_τ) ≥ O(δ). As τ is an arbitrary solution to F, we conclude that opt(F) ≥ O(δ).

A simple corollary to Proposition 1 is that any assignment τ that extends a minimum-cost (K, AB) candidate δ and satisfies F is an optimal solution to F. The UniMaxSAT framework we develop works intuitively by iteratively computing minimum-cost (K, AB) candidates for increasing sets AB of feasible abstraction sets and K of abstract cores, and checking whether they can be extended to solutions to the whole instance. Each check either determines that an optimal solution has been found or provides a new abstract core that is falsified by the current (K, AB) candidate. In the latter case, the new abstract core is an explanation for why the current (K, AB) candidate cannot be extended to an optimal solution of the instance at hand.

4. Viewing cores of the reformulated instance during OLL search as abstract cores bears resemblance to previous work on analyzing core-guided solvers in which cores of the original instance are separated from cores of the reformulated instance (metas) (Narodytska & Bjørner, 2022; Katsirelos, 2023).
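The lower-bounding role of cores behind Proposition 1 can be illustrated for the special case AB = ∅ by brute force: since every solution to F satisfies the cores, optimizing over the cores alone can only underestimate the optimum. A minimal Python check on a hypothetical toy instance (clauses are sets of (variable, polarity) pairs; the formula and weights are not taken from the paper):

```python
from itertools import product

def satisfies(clause, a):
    return any(a[v] == pol for v, pol in clause)

def min_cost(clauses, weights, variables):
    """Minimum objective value over all assignments satisfying the clauses."""
    best = None
    for values in product([0, 1], repeat=len(variables)):
        a = dict(zip(variables, values))
        if all(satisfies(c, a) for c in clauses):
            c = sum(w for v, w in weights.items() if a[v])
            best = c if best is None else min(best, c)
    return best

variables = ["b1", "b2", "b3"]
weights = {"b1": 1, "b2": 2, "b3": 3}
F = [{("b1", 1), ("b2", 1)}, {("b2", 1), ("b3", 1)}]  # hypothetical hard clauses
K = [{("b1", 1), ("b2", 1)}]                          # cores entailed by F
lb, opt = min_cost(K, weights, variables), min_cost(F, weights, variables)
```

Here lb = 1 and opt = 2, mirroring O(δ) ≤ opt(F) for a minimum-cost candidate δ.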
While this is similar to a correctness proof for basic IHS search (see, e.g., Davies & Bacchus, 2011), the generality of UniMaxSAT allows for capturing also other types of SAT-based MaxSAT algorithms. Intuitively, the ability of UniMaxSAT to simulate core-guided algorithms follows from (i) the use of abstract cores and (ii) the fact that UniMaxSAT rules out not only complete (K, AB) candidates but also partial assignments that extend solely to minimum-cost (K, AB) candidates. The following notion of a (minimum-cost) (K, AB) abstract candidate is central for establishing the correctness and generality of UniMaxSAT.

Definition 3. Let F = (F, O) be a MaxSAT instance, AB a collection of feasible abstraction sets and K a set of abstract cores wrt AB. A partial assignment γ^A over a subset of the variables in var(K) ∪ var(O) is a (K, AB) abstract candidate if (i) there is at least one extension τ ⊇ γ^A which is a solution to DEF(AB) ∧ K, i.e., a (K, AB) candidate of F, and (ii) all such extensions are minimum-cost (K, AB) candidates.

Example 6. Consider the MaxSAT instance F = (F, O) from Example 1, the empty collection AB = ∅ of abstraction sets and the set K = {(b1 ∨ b2 ∨ b3), (b3 ∨ b4 ∨ b5)} of abstract cores. One minimum-cost (K, AB) candidate is δ = {¬b1, b2, ¬b3, b4, ¬b5}, and one (K, AB) abstract candidate is γ^A = {¬b1, ¬b3, ¬b5} since the only extension of γ^A to a solution to K is δ. The set {¬b1, ¬b3} is not a (K, AB) abstract candidate since it extends to the (K, AB) candidate {¬b1, b2, ¬b3, b4, b5}, which is not minimum-cost.

An important insight is that the assumptions enforced during the iterations of a core-guided algorithm can be seen as (K, AB) abstract candidates for the set AB of abstraction sets that corresponds to the cardinality constraints added by the core-guided algorithm and the set K of cores of the reformulated instance extracted by the algorithm.
The following example illustrates this for the OLL algorithm (we will detail other core-guided algorithms in Section 5).

Example 7. Recall the MaxSAT instance F = (F, O) from Example 1. Consider the core C = (b1 ∨ b2 ∨ b3 ∨ b4 ∨ b5), and the abstraction set ab = (in, D, out) with in = {b1, b2, b3, b4, b5}, out = {o^C_2, o^C_3, o^C_4, o^C_5} and D = {CNF(∑_{x∈C} x ≥ i ↔ o^C_i) | i = 2, 3, 4, 5}. Let K = {C} and AB = {ab}. The set γ^A = {¬b3, ¬b5, ¬o^C_2, ¬o^C_3, ¬o^C_4, ¬o^C_5} is a (K, AB) abstract candidate since it can be extended to a solution to K ∧ DEF(AB) by assigning exactly one literal in {b1, b2, b4} to 1 and the rest to 0. Note that γ^A is exactly the set of assumptions that Extract-Core is queried under during the second iteration of the OLL invocation detailed in Example 3.

4.2 UniMaxSAT in Detail

With the necessary preliminaries in place, we now detail the UniMaxSAT framework. A high-level view of the framework is shown in Figure 1, and the framework is detailed in pseudocode as Algorithm 4.

Algorithm 4 UniMaxSAT, a unifying framework for SAT-based MaxSAT algorithms
Input: A MaxSAT instance F = (F, O).
Output: An optimal solution τ to F.
1: AB_1 = ∅, K_1 = ∅
2: for i = 1 . . . do
3:   (γ^A_i, lb_i) = Optimize(O, AB_i, K_i)
4:   (res, C_i, τ) = Extract-AbstractCore(F, DEF(AB_i), γ^A_i)
5:   if res = true and O(τ) = lb_i then return τ
6:   K_{i+1} = {C_i} ∪ K_i
7:   AB_{i+1} = AB_i ∪ Add-AbstractionSets(K_{i+1})

Given a MaxSAT instance F = (F, O) as input, UniMaxSAT outputs an optimal solution to F.5 UniMaxSAT accumulates two sets, AB and K, of abstraction sets and abstract cores, respectively. In each iteration, the Optimize subroutine computes an assignment γ^A over OUTS(AB) ∪ var(O) and a lower bound lb for the optimal cost of F. The subroutine Extract-AbstractCore is invoked to check for an extension of γ^A to a solution to F.
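The main loop of Algorithm 4 can be sketched in runnable form. The following Python program instantiates the subroutines by brute force on a hypothetical toy instance, with Add-AbstractionSets returning nothing, so AB stays empty throughout; the line comments refer to the lines of Algorithm 4:

```python
from itertools import product

VARS = ["b1", "b2", "b3"]
WEIGHTS = {"b1": 1, "b2": 1, "b3": 2}                    # hypothetical objective
HARD = [{("b1", 1), ("b2", 1)}, {("b2", 1), ("b3", 1)}]  # hypothetical hard clauses

def sat(clause, a):
    return any(a[v] == p for v, p in clause)

def cost(a):
    return sum(w for v, w in WEIGHTS.items() if a[v])

def optimize(cores):
    # Line 3: a min-cost assignment of the objective variables satisfying all
    # accumulated cores, returned as assumptions, with its cost as the bound lb.
    best = None
    for vals in product([0, 1], repeat=len(VARS)):
        a = dict(zip(VARS, vals))
        if all(sat(c, a) for c in cores) and (best is None or cost(a) < cost(best)):
            best = a
    return best, cost(best)

def extract_core(assumptions):
    # Line 4: SAT check under the assumptions; on failure, return a violated core.
    if all(sat(c, assumptions) for c in HARD):
        return True, None, dict(assumptions)
    return False, next(c for c in HARD if not sat(c, assumptions)), None

def unimaxsat():
    cores = []                                     # the set K; AB stays empty
    while True:
        assumptions, lb = optimize(cores)          # Line 3
        ok, core, tau = extract_core(assumptions)  # Line 4
        if ok:                                     # Line 5: cost equals lb here,
            return tau, lb                         # since assumptions are complete
        cores.append(core)                         # Line 6

tau, lb = unimaxsat()
```

Because the assumptions assign every objective variable, any satisfying extension has cost exactly lb, so this sketch behaves like the basic-IHS instantiation discussed later in the paper.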
If such an extension τ exists (i.e., if res = true), UniMaxSAT checks if the cost of τ matches lb. If this is the case, UniMaxSAT terminates and returns τ as an optimal solution. Otherwise, a new abstract core falsified by γ^A is obtained and added to K. To ensure termination, we require that the γ^A returned by Optimize must correspond to a (K, AB) abstract candidate sufficiently often. Furthermore, when γ^A is a (K, AB) abstract candidate, the lower bound lb returned by Optimize must be equal to the costs of its extensions. Since all such extensions are minimum-cost, it follows that, whenever Optimize computes a (K, AB) abstract candidate, the lower bound Optimize returns is as high as possible in terms of the currently accumulated set of cores.

5. We note that the framework as presented here is a significant modification of the framework described in the preliminary version of this article (Ihalainen et al., 2023). In particular, the correctness of the present version does not require computing an abstract candidate in every iteration, only that each iteration is succeeded by another one on which an abstract candidate is computed. Compared to the preliminary version, this modification allows not only to further capture objective-bounding search algorithms but also to more directly capture core-guided search algorithms that do not compute abstract candidates in every iteration.

Figure 1: A schematic overview of UniMaxSAT, invoked on a MaxSAT instance (F, O). [The figure depicts the data flow between the subroutines Optimize(O, AB, K), Extract-AbstractCore(F, DEF(AB), γ^A) and Add-AbstractionSets(F, K): an assignment γ^A and a lower bound lb, an abstract core C, the set K of cores, and the set AB of abstraction sets.]

Note that Optimize does not need to identify that the assignment it computes is a (K, AB) abstract candidate.
(The identification of (K, AB) abstract candidates could be computationally challenging for many practical instantiations and is in fact not required for the correctness of UniMaxSAT.)

We formalize the correctness of Algorithm 4 in the following terms. UniMaxSAT terminates on any MaxSAT instance and outputs an optimal solution to the input MaxSAT instance at hand, subject to the generic properties of its three subroutines. Importantly, the correctness of the general framework allows for establishing the correctness of any of its instantiations, including variants of objective-bounding, core-guided and IHS algorithms for MaxSAT, by showing how each algorithm can be viewed as an instantiation of UniMaxSAT. First, we establish general conditions that instantiations of Optimize need to meet in order to guarantee that UniMaxSAT correctly computes an optimal solution to an arbitrary input MaxSAT instance.

Definition 4 (Correctness condition). An instantiation of Optimize satisfies the correctness condition if the following conditions hold at every iteration i of UniMaxSAT when invoked on an arbitrary input MaxSAT instance F = (F, O).
1. γ^A_i assigns a subset of the variables in var(O) ∪ OUTS(AB_i).
2. lb_i is a lower bound on the optimal cost of F, i.e., lb_i ≤ opt(F).
3. If γ^A_i is a (K_i, AB_i) abstract candidate, then lb_i is equal to the cost of an extension of γ^A_i to a (K_i, AB_i) candidate of F.
4. There exists an r ≥ 0 such that Optimize returns a (K_{i+r}, AB_{i+r}) abstract candidate in iteration i + r.

In words, condition 1 ensures that, in the iterations in which Extract-AbstractCore determines the instance to be unsatisfiable, an abstract core of the instance is obtained. Condition 2 ensures that UniMaxSAT does not terminate before finding an optimal solution, while conditions 3 and 4 ensure that termination takes place eventually.
The following main theorem formalizes this intuition and establishes generic conditions that the other subroutines of UniMaxSAT need to satisfy to ensure the correctness of UniMaxSAT.

Theorem 1. Assume that the following three properties hold in every iteration i of UniMaxSAT on an input MaxSAT instance F = (F, O) that has a solution.
1. Optimize satisfies the correctness condition (Definition 4).
2. Extract-AbstractCore(F, DEF(AB_i), γ^A_i) computes either a solution τ ⊇ γ^A_i to F ∧ DEF(AB_i) or a(n abstract) core C_i that is satisfied by all solutions to F ∧ DEF(AB_i) and falsified by γ^A_i.
3. AB_i is feasible for F.

Then UniMaxSAT terminates and returns an optimal solution to F.

The formal proof of Theorem 1 relies on the following lemma stating that, in each iteration i in which a (K_i, AB_i) abstract candidate of F is computed for the current sets AB_i and K_i of abstraction sets and abstract cores, respectively, the set of assignments from which Optimize will return an assignment shrinks.

Lemma 1. Invoke UniMaxSAT on a MaxSAT instance F = (F, O) and consider an iteration i. Let K_i and AB_i be the sets of abstract cores and abstraction sets obtained so far, respectively, and denote by obj-sols_i the restrictions of all solutions to K_i ∧ DEF(AB_i) onto var(O). Assume that UniMaxSAT does not terminate on iteration i and that Optimize computes a (K_i, AB_i) abstract candidate of F. Then obj-sols_{i+1} ⊊ obj-sols_i.

Proof. The fact that every element in obj-sols_{i+1} is also an element of obj-sols_i follows from the fact that the sets of abstract cores and abstraction sets monotonically increase during the execution of UniMaxSAT. To show that there is a τ_o ∈ obj-sols_i \ obj-sols_{i+1}, consider the (K_i, AB_i) abstract candidate γ^A_i and the abstract core C_i computed in iteration i. By definition, there is a minimum-cost (K_i, AB_i) candidate δ ⊇ γ^A_i that falsifies C_i.
Let τ_o be the restriction of δ onto the objective variables var(O). The claim of the lemma is equivalent to the claim that there is no extension of τ_o to a solution to DEF(AB_i) that satisfies C_i. As AB_i is feasible, there is exactly one way of extending τ_o to a solution to DEF(AB_i). Since δ is such an extension and δ falsifies C_i, we conclude that τ_o cannot be extended to a solution to DEF(AB_i) that satisfies C_i.

We are now ready to give a proof of Theorem 1.

Proof of Theorem 1. First note that, by assumption 2 of the theorem and condition 1 of the correctness condition, whenever Extract-AbstractCore reports that the current assignment γ^A_i is not extendable to a solution on Line 4 of Algorithm 4, the clause C_i returned by Extract-AbstractCore is an abstract core of the instance at hand with respect to the current set of abstraction sets. As the sets of abstraction sets monotonically increase, AB_i ⊆ AB_{i+1} holds for all i during the execution of UniMaxSAT. We conclude that, in each iteration i of UniMaxSAT, all clauses in K_i are abstract cores of F wrt AB_i.

Optimality of returned solutions. Assume that UniMaxSAT terminates in iteration i and returns a solution τ. As O(τ) = lb ≤ opt(F) ≤ O(τ), it follows that O(τ) = opt(F).

Termination. Given that F has solutions and AB_i is feasible, F ∧ DEF(AB_i) has a solution for each i. By the definition of abstract cores, all solutions to F ∧ DEF(AB_i) are solutions to DEF(AB_i) ∧ K_i. Thus termination follows by showing that Optimize will eventually return a (K_i, AB_i) abstract candidate γ^A_i that can be extended to a solution τ to F ∧ DEF(AB_i). This in turn follows from the finiteness of the number of solutions to DEF(AB_i) ∧ K_i, in particular, from the fact that each new core rules out at least one of the finitely many solutions that Optimize may return.
More specifically, whenever Optimize returns a (K_i, AB_i) abstract candidate and UniMaxSAT does not terminate, the number of assignments to var(O) that can be extended to solutions to the cores decreases by Lemma 1. As Optimize satisfies the correctness condition, the sequence of iterations in which it returns (K_i, AB_i) abstract candidates is infinite. This implies that eventually a (K_i, AB_i) abstract candidate can be extended to a solution to F ∧ DEF(AB_i).

Figure 2: The structure of the results of Sections 5–7. [The figure shows UniMaxSAT (Algorithm 4, Theorem 1) with, as special cases, IHS with abstract cores; objective-bounding search, comprising solution-improving search, linear UNSAT/SAT search, binary search and progression-based search; and core-guided instantiations (Definition 5), comprising cardinality-based CG instantiations (Definition 6).] In the figure, an arrow A → B from algorithm A to B indicates that A can be seen as a special case of B.

5. Capturing SAT-Based MaxSAT Algorithms with UniMaxSAT

In this and the following section, we detail how existing SAT-based MaxSAT algorithms can be viewed as instantiations of UniMaxSAT. Specifically, for the existing objective-bounding search, IHS, and several variants of core-guided search, we explain how to instantiate the three subroutines of UniMaxSAT under the assumptions of Theorem 1 so that the resulting instantiation matches the previously proposed algorithm. By Theorem 1, this yields uniform proofs of correctness for IHS (Davies & Bacchus, 2013, 2011) (including its abstract-cores extension (Berg et al., 2020)), objective-bounding search algorithms (including solution-improving search (Eén & Sörensson, 2006; Le Berre & Parrain, 2010; Koshimura et al., 2012; Paxian et al., 2018)) and modern core-guided algorithms such as OLL (Andres et al., 2012; Morgado et al., 2014), MSU3 (Marques-Silva & Planes, 2007), WPM3 (Ansótegui & Gabàs, 2017), PMRES (Narodytska & Bacchus, 2014) and K (Alviano et al., 2015).
In addition to proving the correctness of existing algorithms, we will furthermore outline in Section 7 a proof-of-concept novel core-guided algorithm and establish its correctness via UniMaxSAT. Figure 2 provides a road map for Sections 5–7. We start with general observations on the extraction of abstract cores and the feasibility of abstraction sets in Section 5.1, and then proceed in Sections 5.2–5.3 by detailing how IHS, IHS with abstract cores, and objective-bounding search algorithms can be viewed as direct instantiations of UniMaxSAT in a way that satisfies the assumptions of Theorem 1. Section 6 is dedicated to viewing core-guided algorithms via UniMaxSAT. For capturing the core-guided algorithms, we will define a generic core-guided instantiation of UniMaxSAT (Definition 5) that captures general properties of core-guided algorithms sufficient for obtaining correct instantiations of UniMaxSAT (as established by Theorem 2). We further introduce the notion of a cardinality-based CG instantiation of UniMaxSAT (Definition 6) that intuitively models core-guided algorithms that add a single cardinality constraint for each extracted core into the instance. We will show that all cardinality-based CG instantiations are also core-guided instantiations (Theorem 3), thereby establishing that cardinality-based CG instantiations are also correct instantiations of UniMaxSAT that satisfy Theorem 1. Turning to individual existing core-guided algorithms, we will show in Section 6.3 that the OLL, PMRES, K, WPM3 and MSU3 algorithms are cardinality-based CG instantiations of UniMaxSAT, thereby also establishing the correctness of each of the algorithms in terms of Theorem 1.
Finally, to further demonstrate the usefulness of the hierarchy depicted in Figure 2 for defining new algorithms, we detail the novel AbstCG algorithm in Section 7 and show that it is a core-guided instantiation of UniMaxSAT but not a cardinality-based CG instantiation.

5.1 Extraction of Abstract Cores and Feasibility of Abstraction Sets

For all specific instantiations of UniMaxSAT we discuss in Sections 5.2, 5.3, 6, and 7, the Extract-AbstractCore subroutine of UniMaxSAT is assumed to be a core-extracting SAT solver. Given a MaxSAT instance F = (F, O), a feasible collection AB of abstraction sets and an assignment γ^A, Extract-AbstractCore invokes a SAT solver on F ∧ DEF(AB) under the assumptions γ^A. The search returns either a solution to F ∧ DEF(AB) that extends γ^A, or a clause entailed by F ∧ DEF(AB) that is falsified by γ^A. Assumption 2 of Theorem 1 on the Extract-AbstractCore subroutine follows directly from established properties of incremental SAT solvers (Eén & Sörensson, 2003; Audemard et al., 2013) instantiating the CDCL SAT solving paradigm (Marques-Silva & Sakallah, 1999; Zhang et al., 2001; Marques-Silva et al., 2021). The feasibility of all abstraction sets computed, i.e., assumption 3 of Theorem 1, follows from the fact that the definitions of every new abstraction set computed only intersect with the MaxSAT instance and previous abstraction sets on the inputs. More precisely, the abstraction sets computed in every instantiation of Add-AbstractionSets we consider satisfy the assumptions of the following lemma.

Lemma 2. Consider a MaxSAT instance F = (F, O) and a set of feasible abstraction sets AB. Let (in, D, out) be an abstraction set and assume var(D) ∩ var(F ∧ DEF(AB)) ⊆ in. Then AB ∪ {(in, D, out)} is feasible for F.

Proof. Let τ be a solution to F. By the feasibility of AB, there is a solution τ_e ⊇ τ to F ∧ DEF(AB). Since the only variables of D assigned by τ_e are in in, by Definition 1 there is a solution τ^E ⊇ τ_e to D.
Such a τ^E is a solution to F ∧ DEF(AB ∪ {(in, D, out)}), establishing the feasibility of AB ∪ {(in, D, out)}.

With these considerations, we will from now on assume all abstraction sets to be feasible and Extract-AbstractCore to be instantiated with a SAT solver.

5.2 Capturing IHS with UniMaxSAT

UniMaxSAT gives the (basic) IHS algorithm (Algorithm 3) through the following instantiation. Instantiate Add-AbstractionSets to never add any abstraction sets, i.e., so that AB_i = ∅ for all i. Further, instantiate Optimize as a procedure that, given a set K of cores, returns the tuple (γ^A, lb) where γ^A = {¬x | x ∈ var(O) \ Mincost-HS(K)} is an assignment that sets all literals in the objective to 0 except the ones in the most recent minimum-cost hitting set Mincost-HS(K) over K. The value of lb is the sum of the coefficients of the variables in the minimum-cost hitting set. The correctness of Algorithm 3 now follows by Theorem 1 by observing that the only extension of γ^A to a (K, ∅) candidate is δ = {¬x | x ∈ var(O) \ Mincost-HS(K)} ∪ {x | x ∈ Mincost-HS(K)}; this candidate is minimum-cost and has O(δ) = lb. Hence γ^A is a (K, ∅) abstract candidate and lb a lower bound on the optimal cost by Proposition 1.

Example 8. Invoke the instantiation of UniMaxSAT that simulates the basic IHS algorithm on the MaxSAT instance F = (F, O) from Example 1. In the first iteration (i = 1), we have K_1 = ∅ and AB_1 = ∅. As such, the call Optimize(O, AB_1, K_1) returns γ^A_1 = {¬b1, . . . , ¬b5} and lb_1 = 0. The subsequent invocation of Extract-AbstractCore(F, DEF(AB_1), γ^A_1) returns res = false and (for example) C_1 = (b1 ∨ b2 ∨ b3 ∨ b4 ∨ b5). The first iteration ends with UniMaxSAT setting K_2 = {C_1} and AB_2 = ∅. In the second iteration, the call Optimize(O, AB_2, K_2) may return a (K_2, AB_2) abstract candidate corresponding to any of the three MCHSes over K_2. Suppose that it returns γ^A_2 = {¬b2, . . . , ¬b5} and lb_2 = 1.
Then the call Extract-AbstractCore(F, DEF(AB_2), γ^A_2) again returns res = false and, e.g., C_2 = (b3 ∨ b4 ∨ b5). The sets K_3 and AB_3 are set to {C_1, C_2} and ∅, respectively. In the third iteration, Optimize computes {b4} as the (only) MCHS over K_3 and returns γ^A_3 = {¬b1, ¬b2, ¬b3, ¬b5} and lb_3 = 1. Notice that γ^A_3 is the only (K_3, AB_3) abstract candidate at this stage. The call Extract-AbstractCore(F, DEF(AB_3), γ^A_3) returns res = false and, e.g., C_3 = (b1 ∨ b2 ∨ b3). In the fourth iteration, there are two possible MCHSes over K_4 = {C_1, C_2, C_3} that both correspond to (K_4, AB_4) abstract candidates. Assume that Optimize(O, AB_4, K_4) returns γ^A_4 = {¬b2, ¬b3, ¬b5} and lb_4 = 2. Then the call Extract-AbstractCore(F, DEF(AB_4), γ^A_4) returns res = true and the solution τ = {b1, ¬b2, ¬b3, b4, ¬b5, x}. As O(τ) = 2 = lb_4, UniMaxSAT terminates and returns τ as an optimal solution of F.

Abstract cores. UniMaxSAT gives IHS enhanced with abstract cores by instantiating Add-AbstractionSets to (heuristically) compute abstraction sets (in, D, out), where the inputs in = {x1, . . . , xn} are a subset of n objective variables that all have the same coefficient in O, out = {o1, . . . , on} is a set of n new variables, and D = {CNF(∑_{x∈in} x ≥ k ↔ o_k) | k = 1, . . . , n}. Informally speaking, the outputs of the added abstraction sets count the number of inputs assigned to 1 in all satisfying assignments. The instantiation of the Optimize subroutine first computes a minimum-cost solution γ to K ∧ DEF(AB) that assigns all variables in var(O) ∪ OUTS(AB), and then returns either γ^A_1 as the restriction of γ onto the objective variables var(O) assigned to 0, or γ^A_2 as the restriction of γ onto those outputs of the current abstraction sets and those objective variables not occurring as inputs to any abstraction set that are assigned to 0 by γ.
More precisely, if INPUTS(AB) is the set of all variables that are inputs to some abstraction set, then γ^A_2 assigns to 0 all variables in OUTS(AB) ∪ (var(O) \ INPUTS(AB)) that are assigned to 0 by γ and does not assign any other variables. In both cases, Optimize also returns the cost of γ as lb. The correctness of IHS enhanced with abstract cores now follows by Theorem 1 by showing that Optimize satisfies the correctness condition. This in turn follows directly from showing that both γ^A_1 and γ^A_2 are (K, AB) abstract candidates, since then γ is a minimum-cost (K, AB) candidate, which by Proposition 1 implies that lb = O(γ) is a lower bound on the optimal cost. The fact that γ^A_1 is a (K, AB) abstract candidate follows since the only extension of γ^A_1 to a solution to K ∧ DEF(AB) is γ. Further, γ^A_2 is a (K, AB) abstract candidate since (i) γ is an extension of γ^A_2 to a solution to K ∧ DEF(AB) and (ii) every such extension sets equally many inputs of each abstraction set to 1, thus incurring exactly the same cost (since the inputs to each abstraction set have the same coefficient in O).

5.3 Capturing Objective-Bounding Search with UniMaxSAT

To capture objective-bounding search by UniMaxSAT, Add-AbstractionSets is instantiated to compute a single abstraction set ab = (var(O), D, out), where all of the variables in the objective occur as inputs, there is an output o_w for every w = 1, . . . , W, where W is the sum of the coefficients of O = ∑_i w_i b_i, and where the definitions are D = {CNF(∑_i w_i b_i ≥ w ↔ o_w) | w = 1, . . . , W}, which, informally speaking, count the sum of the coefficients of the objective variables set to 1 by satisfying assignments, i.e., the costs of satisfying assignments.
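The semantics of the outputs of this single abstraction set can be sketched at the level of assignments: o_w is 1 exactly when the cost reaches w. A small Python illustration with a hypothetical weighted objective (not taken from the paper):

```python
def cost_outputs(assignment, weights):
    """Outputs o_1..o_W of the objective-bounding abstraction set:
    o_w = 1 iff the weighted cost of the assignment is at least w."""
    W = sum(weights.values())
    cost = sum(w for v, w in weights.items() if assignment[v])
    return {w: 1 if cost >= w else 0 for w in range(1, W + 1)}

# Cost 1 + 3 = 4 out of a maximum W = 6:
outs = cost_outputs({"b1": 1, "b2": 0, "b3": 1}, {"b1": 1, "b2": 2, "b3": 3})
```

Assuming ¬o_w in a SAT call then asks for a solution of cost below w, which is how objective-bounding search queries bound the objective.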
The instantiations of Optimize for capturing different objective-bounding search algorithms follow by noting that, for any solution τ to F ∧ DEF({ab}), the cost of τ can be read from the outputs of ab: τ(o_k) = 0 if and only if O(τ) < k holds for all solutions τ to F ∧ DEF({ab}). Similarly, abstract cores of this instance map to lower bounds on the optimal cost: the unit clause (o_k) is an abstract core if and only if k ≤ opt(F). Thus solution-improving, UNSAT-SAT, binary, and progression-based search are all obtained by an instantiation of Optimize that returns assignments that set a single output variable to 0 and a lower bound equal to the largest index w for which (o_w) has been determined to be an abstract core. Termination occurs once the Extract-AbstractCore subroutine has determined (o_{opt(F)}) to be an abstract core and {¬o_{opt(F)+1}} a (K, {ab}) abstract candidate for the set K of cores obtained so far. As a concrete example, to capture solution-improving search, Optimize returns in the first iteration the assignment {¬o_W} and lb = 0. In subsequent iterations, Optimize returns {¬o_{O(τ)}} and lb = 0, where O(τ) is the cost of the latest solution computed by Extract-AbstractCore. When Extract-AbstractCore reports unsatisfiability, an abstract core (o_w) is obtained. The correctness of solution-improving search in terms of Theorem 1 follows from the fact that the assignment γ^A = {¬o_{w+1}} is a ({(o_w)}, {ab}) abstract candidate, the extensions of which have cost lb = w = opt(F).

6. Capturing Core-Guided Search with UniMaxSAT

We turn to detailing how various modern core-guided algorithms can be viewed as instantiations of UniMaxSAT.
Key to viewing any core-guided algorithm through UniMaxSAT is to view the cardinality constraints a core-guided algorithm introduces as abstraction sets, and the reformulation of the objective as an implicit computation of a (K, AB) abstract candidate for the instance, where K and AB are the abstract cores and abstraction sets computed so far, respectively.

6.1 Capturing Generic Properties of Core-Guided Algorithms

Compared to the IHS and objective-bounding search algorithms, significantly more variants of core-guided algorithms have been proposed, differing in the specifics of how the core-relaxation steps are performed. We will in the following identify properties of core-relaxation steps shared by all core-guided algorithms we consider. These general properties allow for a more generic proof of correctness for core-guided algorithms via viewing the algorithms as instantiations of UniMaxSAT.

Definition 5. Consider the ith iteration of UniMaxSAT when invoked on a MaxSAT instance F = (F, O). Let AB and K be the accumulated sets of abstraction sets and abstract cores, respectively. We say that an instantiation of UniMaxSAT is a core-guided instantiation if the following properties hold.
- Extract-AbstractCore is a core-extracting SAT solver and Add-AbstractionSets introduces feasible abstraction sets.
- Optimize maintains a reformulated objective function O^R and in each iteration returns as assumptions the assignment {¬x | x ∈ var(O^R)} that sets all of the variables in O^R to 0. It also returns the constant term of O^R as the lower bound on the optimal cost.
- In each iteration we have O^R(τ) = O(τ) for any solution τ to DEF(AB) ∧ K.
- Given a core, Optimize reformulates O^R in a way that increases the constant term of O^R.

We will now show that any core-guided instantiation of UniMaxSAT correctly computes optimal solutions to MaxSAT instances.

Theorem 2.
For any input MaxSAT instance F that has a solution, any core-guided instantiation of UniMaxSAT terminates and returns an optimal solution to F.

We prove Theorem 2 by showing that a core-guided instantiation of UniMaxSAT satisfies the assumptions of Theorem 1. The non-trivial part of the proof deals with arguing that Optimize satisfies the correctness condition (Definition 4). In the following lemmas, we consider the ith iteration of a core-guided instantiation of UniMaxSAT when invoked on a MaxSAT instance F = (F, O). Let AB_i and K_i be the sets of abstraction sets and abstract cores collected, respectively; O^R_i the reformulated objective maintained by Optimize; and W^lb_i the constant term of O^R_i. The following two observations follow directly from the definition of core-guided instantiations.

Observation 1. W^lb_i is a lower bound on opt(F).

Observation 2. For any solution γ ⊇ {¬x | x ∈ var(O^R_i)} of DEF(AB_i) ∧ K_i, we have O(γ) = W^lb_i.

What remains to be shown is that the assignment γ^A_i = {¬x | x ∈ var(O^R_i)} computed by Optimize will be a (K_{i+r}, AB_{i+r}) abstract candidate in iteration i + r for some r ≥ 0. We first establish that, whenever γ^A_i can be extended to a solution to DEF(AB_i) ∧ K_i, any extension of γ^A_i will be minimum-cost.

Lemma 3. Assume that the assignment γ^A_i = {¬x | x ∈ var(O^R_i)} can be extended to a solution to DEF(AB_i) ∧ K_i. Then γ^A_i is a (K_i, AB_i) abstract candidate.

Proof. Consider an extension τ of γ^A_i to a solution to DEF(AB_i) ∧ K_i. We show that τ is minimum-cost. Since τ(x) = 0 for all x ∈ var(O^R_i), we have that O^R_i(τ) = W^lb_i. By the definition of core-guided instantiations, this implies O(τ) = W^lb_i. The fact that τ is minimum-cost follows from observing that O(δ) = O^R_i(δ) ≥ W^lb_i holds for all solutions δ to DEF(AB_i) ∧ K_i.
Next we show that the assignment returned by Optimize can be extended to a solution of the abstract cores and the definitions at some future iteration.

Lemma 4. Assume F has a solution. There is an r ≥ 0 such that the assignment γ^A_{i+r} = {¬x | x ∈ var(O^R_{i+r})} can be extended to a solution to DEF(AB_{i+r}) ∧ K_{i+r}.

Proof. Assume for contradiction that γ^A_k cannot be extended to a solution to DEF(AB_k) ∧ K_k for any k ≥ i. Then in each iteration k ≥ i, Extract-AbstractCore will return a core. Let τ* be an optimal solution to F and, for a fixed k, τ^E_k its extension to a solution of DEF(AB_k) ∧ K_k. By the properties of core-guided instantiations, we have O(τ*) = O^R_k(τ^E_k). Furthermore, the constant term W^lb of the reformulated objective increases in each iteration in which a core is obtained. Thus eventually for some iteration k′ we have W^lb_{k′} = O^R_{k′}(τ^E_{k′}) = O(τ*). (W^lb_{k′} > O^R_{k′}(τ^E_{k′}) is not possible, since W^lb_{k′} is a lower bound for O^R_{k′}(τ) for any solution τ.) Now O^R_{k′}(τ^E_{k′}) = W^lb_{k′} implies τ^E_{k′} ⊇ {¬x | x ∈ var(O^R_{k′})}. Thus {¬x | x ∈ var(O^R_{k′})} can be extended to a solution to DEF(AB_{k′}) ∧ K_{k′}, contradicting the initial assumption.

A simple corollary of Lemmas 3 and 4 is that Optimize is guaranteed to compute an abstract candidate in some future iteration.

Corollary 1. If F has a solution, then Optimize returns a (K_{i+r}, AB_{i+r}) abstract candidate in iteration i + r for some integer r ≥ 0.

Proof. By Lemma 3 it suffices to show that there is an r ≥ 0 such that the assignment γ^A_{i+r} = {¬x | x ∈ var(O^R_{i+r})} can be extended to a solution to DEF(AB_{i+r}) ∧ K_{i+r}. This is established by Lemma 4.

Finally, the proof of Theorem 2 follows from the previous statements.

Proof of Theorem 2. Observations 1 and 2 together with Corollary 1 imply that Optimize satisfies the correctness condition (assumption 1 of Theorem 1). Assumptions 2 and 3 of Theorem 1 follow directly from the definition of core-guided instantiations.
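The objective-reformulation step that Optimize performs in a core-guided instantiation can be sketched concretely. The following Python function implements an OLL-style update on a core C: subtract the minimum coefficient w_C of the core's variables, add |C| − 1 fresh outputs with coefficient w_C, drop zeroed variables, and increase the constant term by w_C (variable names and weights are hypothetical):

```python
def reformulate(coeffs, const, core, fresh_outputs):
    """One OLL-style core-relaxation step on the reformulated objective,
    kept as a coefficient map plus a constant term."""
    w_c = min(coeffs[x] for x in core)           # minimum coefficient in the core
    for x in core:                               # decrease core coefficients
        coeffs[x] -= w_c
    for o in fresh_outputs:                      # |core| - 1 new outputs at weight w_c
        coeffs[o] = w_c
    coeffs = {x: w for x, w in coeffs.items() if w != 0}  # drop zeroed variables
    return coeffs, const + w_c                   # constant term = reported lower bound

# Objective 1*b1 + 2*b2 + 1*b3 and core (b1 ∨ b2 ∨ b3):
coeffs, const = reformulate({"b1": 1, "b2": 2, "b3": 1}, 0,
                            ["b1", "b2", "b3"], ["o2", "o3"])
```

The constant term, here 1, is exactly the lower bound a core-guided instantiation reports; the invariant O^R(τ) = O(τ) is preserved because the fresh outputs count the core's satisfied literals beyond the first.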
In summary, Theorem 2 provides an alternative way of establishing the correctness of a range of core-guided algorithms by arguing that the core-guided algorithm at hand is a core-guided instantiation. We also establish a similar result for a specific case of core-guided instantiations which we will refer to as cardinality-based CG instantiations. The notion of cardinality-based CG instantiations of UniMaxSAT captures in particular core-guided algorithms which introduce a single cardinality constraint over each extracted core.

Definition 6. We say that an instantiation of UniMaxSAT is a cardinality-based CG instantiation if the following conditions hold.

(i) Extract-AbstractCore is a core-extracting SAT solver.

(ii) Given a core C as input, Add-AbstractionSets computes an abstraction set ab = (in, D, out) for which the following hold. The set of variables of D intersects the set of previous variables only on the inputs, i.e., var(D) ∩ var(F ∧ DEF(AB)) ⊆ in. The number of outputs is one less than the number of variables in the core, i.e., |out| = |C| − 1. Σ_{b∈C} τ(b) = 1 + Σ_{o∈out} τ(o) holds for any solution τ to {C} ∪ D ∪ K ∪ DEF(AB). Here K and AB are the set of cores computed and abstraction sets introduced by the iteration in which C is obtained.

(iii) Optimize is instantiated as Optimize-CB, which maintains a reformulated objective O^R, initialized to the objective O of the input MaxSAT instance. In each iteration, Optimize-CB returns the assignment {¬x | x ∈ var(O^R)} that sets all variables in O^R to 0, and the constant term of O^R as the lower bound. Given an abstract core C and an abstraction set ab = (in, D, out), Optimize-CB updates O^R as follows. (1) The coefficient of each x ∈ C is decreased by the minimum over the coefficients in O^R of the variables in the core, i.e., by w_C = min_{x∈C}{O^R(x)}. (2) Each x ∈ out is added to O^R with the coefficient w_C. (3) The variables in O^R with coefficient 0 are removed. (4) The constant term of O^R is incremented by w_C.

We establish that cardinality-based CG instantiations of UniMaxSAT are a special case of core-guided instantiations.

Theorem 3. Any cardinality-based CG instantiation (Definition 6) is a core-guided instantiation of UniMaxSAT (Definition 5).

Note that by Theorem 2 this implies that any cardinality-based CG instantiation will return an optimal solution to any MaxSAT instance.

Proof. The non-trivial part of the proof is to argue that in each iteration i, we have O^R_i(τ) = O(τ) for any solution τ to DEF(AB_i) ∪ K_i, i.e., the definitions of the abstraction sets added and cores computed by iteration i. The proof is by induction on the iteration i. The base case i = 1 directly follows from O^R_1 = O. Assume then that the statement holds in iteration i. Let O^R_i be the reformulated objective in the beginning of iteration i and assume without loss of generality that an abstract core C_i is extracted in iteration i. Let ab_{i+1} = (in_{i+1}, D_{i+1}, out_{i+1}) be the abstraction set computed by Add-AbstractionSets on input C_i. Then

O^R_{i+1}(τ) ⋆= O^R_{i+1}(τ) + Σ_{x∈C_i} w_{C_i}·τ(x) − (Σ_{x∈out_{i+1}} w_{C_i}·τ(x) + w_{C_i}) = O^R_i(τ).

Here ⋆ follows by the fact that Add-AbstractionSets fits the definition of a cardinality-based CG instantiation, which states that Σ_{x∈C_i} τ(x) = 1 + Σ_{x∈out_{i+1}} τ(x) and hence implies Σ_{x∈C_i} w_{C_i}·τ(x) = w_{C_i} + Σ_{x∈out_{i+1}} w_{C_i}·τ(x). Together with the induction hypothesis O^R_i(τ) = O(τ), this completes the induction step.

6.2 Contrasting Core-Guided and IHS Algorithms through UniMaxSAT

Before moving on to concretely capturing the core-relaxation steps of individual core-guided algorithms, we note that the definition of core-guided instantiations of UniMaxSAT allows for observing an interesting contrast between core-guided and IHS. In particular, in contrast to IHS, the cores extracted by core-guided instantiations of UniMaxSAT always refute all possible minimum-cost solutions to the cores accumulated by that iteration.

Proposition 2.
Consider the ith iteration of a core-guided instantiation of UniMaxSAT invoked on a MaxSAT instance F = (F, O). Let K_i and AB_i be the set of abstract cores and abstraction sets accumulated by the beginning of iteration i, respectively. Assume that Optimize returns a (K_i, AB_i) abstract candidate and Extract-AbstractCore a core C_i. Let then γ^A be any (K_i, AB_i) abstract candidate and τ^E ⊇ γ^A an extension of γ^A to a solution to DEF(AB_i) ∪ K_i. Then τ^E(C_i) = 0.

Proof. We show that τ^E ⊇ {¬x | x ∈ var(O^R_i)}; since var(C_i) ⊆ var(O^R_i), this implies τ^E(C_i) = 0. Since Optimize returns a (K_i, AB_i) abstract candidate at the ith iteration, we have O^R_i(τ^E) = O(τ^E) = W^lb_i. As O^R_i(τ^E) = W^lb_i holds if and only if τ^E(x) = 0 for all x ∈ var(O^R_i), we conclude that τ^E ⊇ {¬x | x ∈ var(O^R_i)}.

Less formally, Proposition 2 states that whenever a core-guided instantiation of UniMaxSAT extracts a core falsified by a (K, AB) abstract candidate γ of F, where K and AB are a set of cores and abstraction sets, respectively, the core refutes not only γ but all possible (K, AB) abstract candidates, and thereby all minimum-cost (K, AB) candidates. In contrast, the following example demonstrates that cores extracted by IHS over an abstract candidate do not necessarily refute all possible abstract candidates.

Example 9. Consider the following execution of IHS (simulated in UniMaxSAT) on the MaxSAT instance F = (F, O) with F = {(b1 ∨ b2 ∨ b3), (b2 ∨ b4)} and O = b1 + b2 + b3 + b4. Assume that Extract-AbstractCore first extracts the core C1 = (b1 ∨ b2 ∨ b3). Further, assume that Optimize then returns γ = {b1, ¬b2, ¬b4} as the ({C1}, ∅) abstract candidate.⁶ Finally, assume that Extract-AbstractCore next returns the core C2 = (b2 ∨ b4). Although γ(C2) = 0, C2 does not refute all ({C1}, ∅) abstract candidates. This is because {¬b1, b3, ¬b4} is a ({C1}, ∅) abstract candidate and τ^E = {¬b1, b2, b3, ¬b4} an extension of it to a solution to {C1} that satisfies C2.

6.
Recall that IHS simulated in UniMaxSAT does not add abstraction sets.

6.3 Capturing Algorithm-Specific Core Relaxations of Core-Guided Algorithms

Having established general conditions for the correctness of core-guided algorithms, we move on to detailing how the algorithm-specific core relaxations of modern core-guided algorithms can be viewed as cardinality-based CG instantiations of UniMaxSAT. Specifically, we detail this individually for OLL (Andres et al., 2012; Morgado et al., 2014), WPM3 (Ansótegui & Gabàs, 2017), MSU3 (Marques-Silva & Planes, 2007), PMRES (Narodytska & Bacchus, 2014) and K (Alviano et al., 2015) as key representatives of modern core-guided algorithms. More precisely, as the definition of cardinality-based CG instantiations prescribes how Extract-AbstractCore and Optimize are instantiated in each of these algorithms, we now detail how to instantiate Add-AbstractionSets in ways that fit Lemma 2 and Definition 6, and at the same time match the core-relaxation step of the individual algorithms.

6.3.1 OLL

The OLL algorithm (recall Section 3.2) is viewed as a cardinality-based CG instantiation of UniMaxSAT by instantiating Add-AbstractionSets to introduce, given a core C as input, the abstraction set

ab_C = (C, {asCNF(Σ_{x∈C} x ≥ i → o_i) | 2 ≤ i ≤ |C|}, {o_2, ..., o_{|C|}}),

matching the OLL core relaxation. For any reasonable encoding of the cardinality constraint, ab_C clearly satisfies condition (ii) of Definition 6. Hence OLL is a cardinality-based CG instantiation of UniMaxSAT.

Example 10. Invoke the cardinality-based CG instantiation of UniMaxSAT that corresponds to OLL on the MaxSAT instance F = (F, O) from Example 1. Before the main search loop, the reformulated objective O^R of Optimize is set to O, and the sets K_1 and AB_1 both to ∅. In the first iteration, Optimize(O, AB_1, K_1) returns the assignment γ^A_1 = {¬b1, ..., ¬b5} containing the negation of all variables in O^R, and lb_1 = 0 corresponding to the constant term of O^R. The call Extract-AbstractCore(F, DEF(AB_1), γ^A_1) then returns res = false and, e.g., the core C1 = (b1 ∨ b2 ∨ b3 ∨ b4 ∨ b5). The set of cores is then updated by letting K_2 = {C1}, after which Add-AbstractionSets(K_2) forms the new abstraction set

ab_1 = ({b1, b2, b3, b4, b5}, {asCNF(Σ_{i=1}^{5} b_i ≥ k → o¹_k) | 2 ≤ k ≤ 5}, {o¹_2, ..., o¹_5})

and UniMaxSAT sets AB_2 = {ab_1}. In the second iteration Optimize(O, AB_2, K_2) uses the core C1 and abstraction set ab_1 to update O^R following Definition 6. The new reformulated objective is O^R = 2b3 + b5 + o¹_2 + o¹_3 + o¹_4 + o¹_5 + 1, and so Optimize returns γ^A_2 = {¬b3, ¬b5, ¬o¹_2, ¬o¹_3, ¬o¹_4, ¬o¹_5} and lb_2 = 1. The call Extract-AbstractCore(F, DEF(AB_2), γ^A_2) then returns res = false and, e.g., the (abstract) core C2 = (o¹_2 ∨ b3). Adding C2 to K, i.e., letting K_3 = {C1, C2}, results in Add-AbstractionSets introducing the abstraction set

ab_2 = ({o¹_2, b3}, {asCNF(o¹_2 + b3 ≥ 2 → o²_2)}, {o²_2}).

In the third iteration Optimize again updates O^R based on ab_2 and C2 to be O^R = b3 + b5 + o¹_3 + o¹_4 + o¹_5 + o²_2 + 2, and returns γ^A_3 = {¬b3, ¬b5, ¬o¹_3, ¬o¹_4, ¬o¹_5, ¬o²_2} and lb_3 = 2. This time the call Extract-AbstractCore(F, DEF(AB_3), γ^A_3) returns res = true and, e.g., the solution τ = {b1, b2, x, ¬b3, ¬b4, ¬b5} ∪ {o¹_2, ¬o¹_3, ¬o¹_4, ¬o¹_5, ¬o²_2}, which is then returned as an optimal solution to F.

6.3.2 PMRES

The PMRES algorithm is viewed as a cardinality-based CG instantiation of UniMaxSAT by instantiating Add-AbstractionSets to introduce for every core C = (b_1 ∨ ... ∨ b_n) the abstraction set

ab_C = (C, D = {b_i ∧ (b_{i+1} ∨ ... ∨ b_n) ↔ o_i | 1 ≤ i ≤ n−1}, {o_1, ..., o_{n−1}}).
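The intended semantics of the PMRES outputs can be sketched as follows (an illustration under the definitions above, not the paper's code): an output o_i is 1 exactly when b_i is 1 and some later core variable is also 1, so any assignment setting m ≥ 1 core variables to 1 sets exactly m − 1 outputs to 1.

```python
# Sketch of the PMRES counting property (hypothetical helper, not the
# paper's code): evaluate o_i <-> b_i and (b_{i+1} or ... or b_n) on a
# truth assignment to the core variables b1..bn, and check that exactly
# one less output than true inputs is set to 1.
from itertools import product

def pmres_outputs(bits):
    n = len(bits)
    return [int(bits[i] and any(bits[i + 1:])) for i in range(n - 1)]

for bits in product([0, 1], repeat=4):
    if sum(bits) >= 1:  # the assignment satisfies the core
        assert sum(pmres_outputs(bits)) == sum(bits) - 1
print("counting property holds")
```

This is exactly the counting requirement of condition (ii) of Definition 6, checked exhaustively for a core of four variables.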
In practice, the definition of each o_i is represented in CNF in the style of the standard Tseitin encoding (Tseitin, 1983; Prestwich, 2021) by introducing the name d_i for the disjunction (b_{i+1} ∨ ... ∨ b_n), i.e., adding clauses equivalent to d_i ↔ (b_{i+1} ∨ ... ∨ b_n).⁷ To establish the correctness of PMRES in terms of UniMaxSAT through Theorem 3, we argue that this instantiation of Add-AbstractionSets satisfies condition (ii) of Definition 6. Consider a solution τ to D ∪ {C}. Let in = {b_{i1}, ..., b_{im}} be the set of variables occurring in C assigned to 1 by τ. We show that the set of outputs that τ assigns to 1 is out = {o_{i1}, ..., o_{i_{m−1}}}. As τ is a solution to D, it assigns all o ∈ out to 1. In the opposite direction, any other output o_k ∉ out assigned to 1 by τ would result in a variable of the core b_k ∉ in also being assigned to 1 by τ, which is a contradiction.

7. The auxiliary d_i variables do not obstruct the main observations made here.

6.3.3 K

The core relaxation of the K algorithm is intuitively a combination of the PMRES and OLL relaxations. K partitions the found cores into subsets of bounded size, relaxes each partition similarly to OLL, and then merges the relaxed partitions similarly to PMRES. More precisely, given a core C = (b_1 ∨ ... ∨ b_{mk}), K partitions C into m subsets P_i, each of size k, such that C = P_1 ∪ ... ∪ P_m. Each partition P is relaxed with a cardinality constraint {asCNF(Σ_{b∈P} b ≥ l → o^P_l) | l = 1, ..., k} similarly as in OLL. Finally, the cardinality constraints of the partitions are merged by adding PMRES-style constraints of the form {o^{P_i}_1 ∧ (o^{P_{i+1}}_1 ∨ ... ∨ o^{P_m}_1) → o^R_i | 1 ≤ i ≤ m−1}, where o^P_1 is the first output of the cardinality constraint introduced for the partition P. For formalizing K as a cardinality-based instantiation of UniMaxSAT, consider an instantiation of Add-AbstractionSets that returns a single abstraction set that combines both of these relaxations. More precisely, consider a core C of size mk partitioned into C = P_1 ∪ ... ∪ P_m by m subsets P_1, ..., P_m with P_i = (b^{P_i}_1, ..., b^{P_i}_k). We define three different types of abstraction sets:

ab^{P_i} = (P_i, {asCNF(Σ_{b∈P_i} b ≥ l → o^{P_i}_l) | l = 1, ..., k}, {o^{P_i}_1, ..., o^{P_i}_k}) for each i = 1, ..., m,

ab^R = ({o^{P_i}_1 | i = 1, ..., m}, {o^{P_i}_1 ∧ (o^{P_{i+1}}_1 ∨ ... ∨ o^{P_m}_1) → o^R_i | 1 ≤ i ≤ m−1}, {o^R_1, ..., o^R_{m−1}}),

ab = (C, DEF({ab^R}) ∪ ⋃_{j=1}^{m} DEF(ab^{P_j}), out),

where out = {o^R_1, ..., o^R_{m−1}} ∪ {o^{P_j}_i | j = 1, ..., m and i = 2, ..., k}. The intuition underlying these sets is that each ab^{P_i} corresponds to the OLL-style relaxation of the partition P_i and the set ab^R to the PMRES-style relaxation that combines all of them. In other words, the set ab collects all of the relaxations into a single abstraction set that satisfies the conditions of Definition 6. The fact that an instantiation of Add-AbstractionSets which on input C returns the abstraction set ab satisfies condition (ii) of Definition 6 follows straightforwardly, albeit being somewhat tedious to formally prove, as a consequence of the arguments we already made for PMRES and OLL. For some intuition, note that the set out contains m(k−1) + (m−1) = mk − 1 output variables, which is one less than the number of variables in C. Furthermore, any solution τ to {C} and the definitions of ab assigns in each ab^{P_i} exactly the same number of inputs and outputs to 1. The output with index 1 from each P_i is then further relaxed by the PMRES-style abstraction set ab^R. This ensures that exactly one less output in out will be assigned to 1.

6.3.4 MSU3

The MSU3 algorithm is specific to unweighted MaxSAT instances, i.e., instances in which the objective coefficients are all equal. On an unweighted MaxSAT instance (F, O), MSU3 maintains a single cardinality constraint Σ_{b∈active} b ≤ bound ∨ o_bound, where the set active contains the objective variables that have occurred in the so-far extracted cores. The bound bound counts the number of cores that have been extracted so far.
When a new core is extracted, the objective variables in the core are added to active and the bound is incremented by one. Informally speaking, in each iteration the SAT solver is queried for a solution that sets exactly bound objective variables to 1. The increment is due to the fact that each new core obtained implies that the optimal cost of the instance is at least one higher than bound.

To see that MSU3 is a cardinality-based CG instantiation of UniMaxSAT, we define an instantiation of Add-AbstractionSets that corresponds to the core relaxation performed by MSU3 as just described. When the first core C1 is extracted, Add-AbstractionSets initializes a bound bound to 1 and introduces the abstraction set

ab_1 = (in_1, {asCNF(Σ_{b∈in_1} b ≥ bound + j → o_j) | 1 ≤ j ≤ |C1| − 1}, out_1),

where in_1 = C1 and out_1 = {o_1, ..., o_{|C1|−1}}. In iteration i > 1 the obtained core C_i is first extended with all outputs in out_{i−1} of the previous abstraction set ab_{i−1}. The resulting C′_i = C_i ∪ out_{i−1} is clearly a core as well. Then the bound is incremented by one and a new abstraction set

ab_i = (in_i, {asCNF(Σ_{b∈in_i} b ≥ bound + j → o^i_j) | 1 ≤ j ≤ |C′_i| − 1}, {o^i_1, ..., o^i_{|C′_i|−1}})

is introduced. Here in_i = in_{i−1} ∪ (C_i ∩ var(O)) contains the objective variables that are in C_i and the inputs in_{i−1} of the previous abstraction set. Notice that due to the core-extension step and the instance being unweighted, all outputs of ab_{i−1} are removed from the reformulated objective, and as such ignored in subsequent iterations. The core-extension step is a minor technical detail required to fit the formalization of MSU3 into the definition of a cardinality-based CG instantiation of UniMaxSAT. It does not affect the algorithm in any meaningful way. In the formalization, Optimize-CB returns γ^A_1 = {¬o^i_t | t = 1, ..., |C′_i| − 1} in iteration i.
An exact correspondence to the description of MSU3 given at the beginning of the section would instead return γ^A_2 = {¬o^i_1}. These two are, however, essentially equal, since o^i_t ≥ o^i_{t+1} holds for all t. Specifically, there is no (K, AB) candidate of the instance, for the current sets K and AB of cores and abstraction sets, respectively, that would extend γ^A_2 but not γ^A_1.

MSU3 can be seen as a special case of the WPM3 algorithm discussed next. As such, a formal proof of the fact that the abstraction set ab_i satisfies condition (ii) of Definition 6 follows from the corresponding proof for WPM3, provided in Appendix A. Informally, the proofs make use of the fact that if k inputs are assigned to 1 by a solution τ, then k ≥ bound and k − bound outputs are assigned to 1 by τ.

6.3.5 WPM3

The WPM3 algorithm combines elements of OLL and MSU3 in that it maintains a set of several cardinality constraints, but only over objective variables. More precisely, assume that WPM3 on a MaxSAT instance (F, O) extracts the core C = (o^{C_1}_{t_1} ∨ ... ∨ o^{C_n}_{t_n} ∨ b_1 ∨ ... ∨ b_m). Each o^{C_i}_{t_i} is an output of a cardinality constraint introduced in the relaxation of a previously found core C_i and each b_i an objective variable. WPM3 relaxes C by introducing a new cardinality constraint

Σ_{b∈in_C} b ≤ bound_C ∨ o_{bound_C},

where in_C contains the objective variables of C and the inputs of all cardinality constraints whose outputs appear in C. The bound bound_C is defined recursively as bound_C = 1 if C only contains objective variables, and bound_C = 1 + Σ_{i=1}^{n} bound_i otherwise. Here bound_i is the bound of the abstraction set introduced when relaxing the previously found core C_i. All of the cardinality constraints the outputs of which appear in C are removed from the working instance. Conceptually, the new cardinality constraint merges the inputs of the constraints whose outputs appear in C, and the bound is the number of inputs that can be inferred to be 1 from the cores extracted so far.⁸
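The recursive definition of bound_C can be sketched directly (a hypothetical representation, not the paper's code: cores are lists of variable names, and the bounds of earlier abstraction sets are keyed by their output variables):

```python
# Sketch of WPM3's recursive bound computation (hypothetical helper).

def bound_of(core, bounds):
    """bound_C = 1 if C contains only objective variables, otherwise
    1 + the sum of the bounds of the sets whose outputs appear in C."""
    prev = [bounds[x] for x in core if x in bounds]
    return 1 + sum(prev)

bounds = {}                                 # bounds keyed by output variable
b_first = bound_of(["b1", "b2"], bounds)    # only objective variables -> 1
bounds["o1"] = b_first
b_merged = bound_of(["o1", "b3"], bounds)   # merges one earlier set -> 1 + 1
print(b_first, b_merged)  # 1 2
```

The merged bound reflects that the two cores together force at least two inputs of the merged constraint to 1.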
The formalization of WPM3 as a cardinality-based CG instantiation of UniMaxSAT is similar to the formalization of MSU3. In terms of UniMaxSAT, a core extracted by WPM3 is of the form C = (o^{ab_1}_{t_1} ∨ ... ∨ o^{ab_n}_{t_n} ∨ b_1 ∨ ... ∨ b_m). Here o^{ab_i}_{t_i} is an output of the abstraction set ab_i introduced earlier. The instantiation of Add-AbstractionSets that corresponds to WPM3 first extends C to C′ ⊇ C by adding, for each output o^{ab_i}_{t_i} ∈ C, all of the outputs of ab_i that are in the reformulated objective maintained by Optimize-CB. The resulting C′ remains a core, since all outputs of a fixed abstraction set have the same coefficient when introduced to the reformulated objective. Thus C′ either contains all of the outputs of a previously introduced abstraction set or none of them. A new abstraction set

ab_{C′} = (in_{C′}, D_{C′} = {asCNF(Σ_{b∈in_{C′}} b ≥ bound_{C′} + j → o^{C′}_j) | 1 ≤ j ≤ |C′| − 1}, out_{C′})

is then introduced. The inputs in_{C′} = {b_1, ..., b_m} ∪ ⋃_{i=1}^{n} in_i consist of the objective variables in C and the inputs in_i of all previous abstraction sets ab_i whose outputs appear in C′. The outputs out_{C′} = {o^{C′}_1, ..., o^{C′}_{|C′|−1}} are new variables. The bound bound_{C′} of the abstraction set ab_{C′} is defined analogously to the bounds on cardinality constraints: bound_{C′} = 1 if C′ only contains objective variables, and bound_{C′} = 1 + Σ_{i=1}^{n} bound_i otherwise, where bound_i is the bound of the abstraction set ab_i. The definitions D_{C′} ensure that the outputs count the number of inputs assigned to 1 in addition to bound_{C′}. A formal proof of the fact that the abstraction set ab_{C′} satisfies condition (ii) of Definition 6 is provided in Appendix A.
Informally, the result follows from three observations: (i) the definitions D_{C′} ensure that any solution assigning k ≥ bound_{C′} inputs to 1 will assign k − bound_{C′} outputs to 1; (ii) the definitions of the previous abstraction sets ensure that such a solution will assign k − bound_{C′} variables of C′ to 1; and (iii) the set of accumulated cores ensures that at least bound_{C′} inputs will be assigned to 1 in any solution.

8. As a minor technical remark, if objective variables with different coefficients appear in cores together, they may end up as inputs in different cardinality constraints. Whenever the inputs of the merged cardinality constraints contain the same variable, the union operator should be understood as additive (multiset) union, for which, e.g., {x, y} ∪ {x} = {x, x, y}.

6.4 On the Frequency of Abstract Candidates

We end this section with observations on how frequently core-guided algorithms compute abstract candidates. Recall that whenever an instantiation of UniMaxSAT computes a (K, AB) abstract candidate of the instance with the current sets K and AB of cores and abstraction sets, respectively, a lower bound as high as possible given the cores extracted so far is obtained. Intuitively, the more frequently (K, AB) abstract candidates are computed during the search, the faster (in terms of iterations) new lower bounds are obtained. First, we will show that in the general case, cardinality-based CG instantiations of UniMaxSAT will not necessarily compute a (K, AB) abstract candidate in every iteration.

Example 11. Consider the MaxSAT instance F = (F, O) with F = {(b1), (b2)} and O = b1 + 2b2. Invoke a cardinality-based CG instantiation of UniMaxSAT on F. Assume that the first core extracted is C1 = (b1 ∨ b2). Then Add-AbstractionSets introduces an abstraction set ab_1 with one output o_1. Further, Optimize-CB updates its reformulated objective to O^R_1 = b2 + o_1 + 1 and returns the set of assumptions γ^A = {¬b2, ¬o_1}.
Since Add-AbstractionSets fulfills condition (ii) of Definition 6 and any solution to F assigns both of the variables in C1 to 1, any solution assigns the variable o_1 to 1. Thus C2 = (o_1) is an abstract core that can be extracted in the next iteration, which results in an abstraction set ab_2 without any outputs. In the next call to Optimize-CB its objective is updated to O^R_2 = b2 + 2 and the set γ^A = {¬b2} is returned. Now γ^A is not a ({C1, C2}, {ab_1, ab_2}) abstract candidate, since all solutions to DEF({ab_1}) ∪ {C1, C2} assign b2 to 1.

Example 11 is stated in terms of a generic instantiation of UniMaxSAT and hence applies to all cardinality-based CG instantiations, including OLL, PMRES, K, WPM3, and MSU3. In other words, each of these algorithms may compute intermediate lower bounds that are weaker than what could be inferred based on the cores extracted so far. In contrast, it turns out that these algorithms differ in this respect when restricting to unweighted MaxSAT instances in which all objective coefficients are equal. Specifically, PMRES and WPM3 are guaranteed to always compute abstract candidates when invoked on an unweighted instance, thereby obtaining as strong lower bounds as possible, while this is not the case for OLL. To establish this, we first show that there are unweighted instances on which UniMaxSAT instantiated as OLL may not always compute abstract candidates. For some intuition on the reasons for this: in contrast to PMRES, the outputs of abstraction sets introduced by OLL can lead to situations where already-extracted cores imply other cores irrespective of the input instance; in contrast to WPM3, at the same time, the outputs introduced by OLL need not be removed from the reformulated objective.

Example 12. Consider an invocation of UniMaxSAT instantiated as OLL on an unweighted MaxSAT instance F. Assume two cores C and D are extracted, both containing three variables.
This results in the introduction of the abstraction sets

ab_C = (C, {asCNF(Σ_{x∈C} x ≥ i → o^C_i) | i = 2, 3}, {o^C_2, o^C_3}) and ab_D = (D, {asCNF(Σ_{x∈D} x ≥ i → o^D_i) | i = 2, 3}, {o^D_2, o^D_3}).

Assume that C1 = (o^C_2 ∨ o^D_2) is the next core extracted, with the corresponding abstraction set ab_1 = (C1, asCNF(Σ_{x∈C1} x ≥ 2 → o^{C1}_2), {o^{C1}_2}). Finally, let the next two cores extracted be C2 = (o^C_3) and C3 = (o^D_3), resulting in the two abstraction sets ab_2 = ({o^C_3}, ∅, ∅) and ab_3 = ({o^D_3}, ∅, ∅). Figure 3 illustrates the abstraction sets added after extracting these cores.

Figure 3: Structure of the cores and abstraction sets of Example 12. Each pair of connected ellipse and rectangle nodes corresponds to an abstraction set, with the inputs (the extracted core) of that set appearing in the ellipse and the outputs in the rectangle. The dashed edges visualize how outputs of ab_C and ab_D appear as inputs to new abstraction sets.

In the subsequent iteration, the partial assignment δ^E computed by Optimize-CB will include ¬o^{C1}_2. However, then δ^E is not a (K, AB) abstract candidate, where K = {C, D, C1, C2, C3} and AB = {ab_C, ab_D, ab_1, ab_2, ab_3}. This is because there is no extension of δ^E to a solution to K ∪ DEF(AB). To see this, note that any solution τ to K ∪ DEF(AB) has to assign τ(o^C_3) = τ(o^D_3) = 1 and therefore also τ(o^C_2) = τ(o^D_2) = 1. By the definitions in ab_1 this implies τ(o^{C1}_2) = 1.

In contrast to OLL, when invoked on an unweighted MaxSAT instance, both PMRES and WPM3 are guaranteed to compute abstract candidates in each iteration, and thus both algorithms are guaranteed to obtain as strong lower bounds as possible. This is formalized in the following proposition, the proof of which is provided in Appendix A.

Proposition 3.
Invoke UniMaxSAT instantiated as PMRES or WPM3 on an unweighted MaxSAT instance F = (F, O). In the ith iteration of search, let O^R_i be the reformulated objective maintained by Optimize-CB. Let also K_i be the set of cores and AB_i the collection of abstraction sets collected so far. The partial assignment γ^A_i = {¬x | x ∈ var(O^R_i)} returned by Optimize-CB is a (K_i, AB_i) abstract candidate.

The results of Proposition 3 and Example 12 demonstrate the potential of UniMaxSAT for analyzing existing SAT-based MaxSAT solving algorithms.

7. UniMaxSAT as Basis for New Algorithmic Variants

We emphasize that the main contributions of this work are the formal UniMaxSAT framework and the unifying proofs of correctness the framework yields for established SAT-based MaxSAT solving approaches. Beyond these main contributions, we now briefly point out that the framework can also be used for obtaining new algorithmic variants of the SAT-based MaxSAT solving approaches, and thereby for providing proofs of correctness for such variants. To illustrate this further potential of the UniMaxSAT framework, we describe a novel variant AbstCG of core-guided search as an instantiation of UniMaxSAT. While AbstCG could be designed on its own, viewing it as an instantiation of UniMaxSAT immediately implies that this new algorithmic variant is correct, highlighting the usefulness of UniMaxSAT in developing new correct MaxSAT algorithms. For AbstCG, similarly as for other core-guided algorithms, Extract-AbstractCore is a core-extracting SAT solver, Add-AbstractionSets introduces abstraction sets for each core, and Optimize maintains a reformulated objective O^R and always returns {¬x | x ∈ var(O^R)} as assumptions.
Given a core C consisting of variables that have m different coefficients in O^R, the instantiation of Add-AbstractionSets in AbstCG first partitions C into m disjoint sets C = G_1 ∪ ... ∪ G_m so that all variables in the same set G_i have the same coefficient, with the sets G_i indexed by decreasing coefficients. Starting from G_1 (corresponding to the largest coefficient in O^R), AbstCG introduces for each G_i an abstraction set

ab_i = (in_i, {asCNF(Σ_{x∈in_i} x ≥ j → o_{i,j}) | 1 ≤ j ≤ |in_i|}, out_i).

The inputs in_i = G_i ∪ out_{i−1} consist of the variables in G_i and the outputs of ab_{i−1}. Since C is an abstract core, at least one of its variables is assigned to 1 in any solution. Hence the first output o_{m,1} of the last abstraction set is not included in out_m. The Optimize instantiation in AbstCG updates the reformulated objective O^R by processing each abstraction set ab_i = (in_i, D_i, out_i) in order, starting from i = 1. The coefficient of each x ∈ in_i is decreased by w_i = min({O^R(x) | x ∈ in_i}) and each output x ∈ out_i is included in O^R with coefficient w_i. After processing each ab_i, the constant w_m is added to O^R.

Example 13. Invoke AbstCG on the MaxSAT instance F = (F, O) with F = {(b1 ∨ b2 ∨ b3 ∨ b4 ∨ b5)} ∪ asCNF(Σ_{2≤i≤5} b_i ≥ 3) and O = b1 + 5b2 + 5b3 + 5b4 + 5b5. Assume that the first core is C = (b1 ∨ b2 ∨ b3 ∨ b4 ∨ b5). Core relaxation divides the variables of C into G_1 = {b2, b3, b4, b5} and G_2 = {b1}. The abstraction set over G_1 has four output variables o_{1,j} with 1 ≤ j ≤ 4, defined by asCNF(Σ_{x∈G_1} x ≥ j → o_{1,j}). The abstraction set over G_2 has five output variables o_{2,j} with 1 ≤ j ≤ 5, defined by asCNF(Σ_{x∈G_2∪out_1} x ≥ j → o_{2,j}). After reformulation, the objective O^R is O^R = 4o_{1,1} + 4o_{1,2} + 4o_{1,3} + 4o_{1,4} + o_{2,2} + o_{2,3} + o_{2,4} + o_{2,5} + 1. The assumptions γ^A for the next call to Extract-AbstractCore consist of the negations of the variables in O^R: γ^A = {¬o_{1,1}, ¬o_{1,2}, ¬o_{1,3}, ¬o_{1,4}, ¬o_{2,2}, ¬o_{2,3}, ¬o_{2,4}, ¬o_{2,5}}. Assume that next, the unit cores (o_{1,1}), (o_{1,2}), (o_{1,3}), (o_{2,2}), (o_{2,3}) are extracted.
After this, the assumptions γ^A for the next call to Extract-AbstractCore are γ^A = {¬o_{1,4}, ¬o_{2,4}, ¬o_{2,5}}. Then Extract-AbstractCore returns, for example, the solution τ = {¬b1, b2, b3, b4, ¬b5} as an optimal solution to F.

For an informal connection between AbstCG and OLL, note that when the variables of C all have the same coefficient in O^R, the reformulation performed by AbstCG is the same as the one performed by OLL. The two algorithms differ in how cores with more than one distinct coefficient are processed. To illustrate this difference, consider the abstraction sets introduced by AbstCG and OLL when relaxing the core C1 = (b1 ∨ b2 ∨ b3 ∨ b4 ∨ b5) from Example 13. Figure 4 illustrates the abstraction sets introduced by OLL (left) and AbstCG (right). In the figure, each pair of connected ellipse and rectangle nodes corresponds to an abstraction set. The inputs of each abstraction set are represented by an ellipse and the outputs by a rectangle. Additionally, the variables with a positive coefficient in the reformulated objectives, O^R = 4b2 + 4b3 + 4b4 + 4b5 + o_2 + o_3 + o_4 + o_5 for OLL and O^R = 4o_{1,1} + 4o_{1,2} + 4o_{1,3} + 4o_{1,4} + o_{2,2} + o_{2,3} + o_{2,4} + o_{2,5} for AbstCG, are highlighted in boldface.

Figure 4: The abstraction sets introduced by OLL (left) and AbstCG (right) when relaxing the core C1 = (b1 ∨ b2 ∨ b3 ∨ b4 ∨ b5) from Example 13. Connected ellipse and rectangle nodes correspond to an abstraction set, with the inputs of a set mentioned within the ellipse and the outputs within the rectangle. Variables with a positive coefficient in the reformulated objective O^R after the reformulation are highlighted in boldface.

We observe that OLL keeps most of the variables in C1 in the reformulated objective.
In contrast, AbstCG removes all of the variables in the core from the objective and instead introduces new variables that keep track of how many of the variables in the set {b2, b3, b4, b5} are assigned to 1 in subsequent iterations.

We establish the correctness of AbstCG as a core-guided instantiation of UniMaxSAT via Theorem 2. The non-trivial part is to argue that the reformulated objective O^R maintained by AbstCG preserves the costs of all solutions.

Proposition 4. Assume that the AbstCG instantiation of UniMaxSAT is invoked on a MaxSAT instance. Let O^R_i be the reformulated objective in iteration i and AB_i and K_i the set of abstraction sets introduced and cores obtained by iteration i, respectively. Let τ be a solution to DEF(AB_i) ∪ K_i. Then O^R_i(τ) = O(τ).

Proof. By induction on the iteration i. The base case i = 1 follows from O^R_1 = O. Assume then that the statement holds in iteration i. Let O^R_i be the reformulated objective in the beginning of iteration i. Assume without loss of generality that an abstract core C_i is extracted in iteration i and partitioned into m subsets G_1, ..., G_m. Let ab^i_j = (in^i_j, D^i_j, out^i_j) for j = 1, 2, ..., m be the abstraction sets computed by Add-AbstractionSets on input C_i, and let w^i_j be the minimum weight of each G_j for j = 1, 2, ..., m. Then

O^R_{i+1}(τ) = O^R_i(τ) − Σ_{j=1}^{m} (Σ_{b∈in^i_j} w^i_j·τ(b) − Σ_{o∈out^i_j} w^i_j·τ(o)) + w^i_m ⋆= O^R_i(τ).

Step ⋆ follows from the observation that Σ_{b∈in^i_j} τ(b) = Σ_{o∈out^i_j} τ(o) for all j < m, which implies Σ_{b∈in^i_j} w^i_j·τ(b) − Σ_{o∈out^i_j} w^i_j·τ(o) = 0. Furthermore, since C_i is a core, we have Σ_{b∈in^i_m} τ(b) ≥ 1. Recalling that o^i_{m,1} ∉ out^i_m, we can conclude that Σ_{b∈in^i_m} τ(b) = Σ_{o∈out^i_m} τ(o) + 1, and hence Σ_{b∈in^i_m} w^i_m·τ(b) − Σ_{o∈out^i_m} w^i_m·τ(o) − w^i_m = 0.
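The reformulation analyzed in Proposition 4 can be replayed on the data of Example 13 with a short sketch (our illustration, not the CGSS2 implementation; variable names follow the example):

```python
# Sketch replaying the AbstCG reformulation from Example 13 (names like
# "o1,1" are the example's output variables; this is an illustration only).

def abstcg_reformulate(obj, groups_with_outputs):
    const = 0
    for k, (inputs, outputs) in enumerate(groups_with_outputs, start=1):
        w = min(obj[x] for x in inputs)      # w_i = min coefficient in in_i
        for x in inputs:
            obj[x] -= w
            if obj[x] == 0:
                del obj[x]
        for o in outputs:                    # outputs enter with coefficient w
            obj[o] = obj.get(o, 0) + w
        if k == len(groups_with_outputs):
            const += w                       # constant w_m for the last group
    return obj, const

obj = {"b1": 1, "b2": 5, "b3": 5, "b4": 5, "b5": 5}
ab1_in, ab1_out = ["b2", "b3", "b4", "b5"], ["o1,1", "o1,2", "o1,3", "o1,4"]
# in_2 = G2 U out_1; the first output o2,1 of the last set is excluded
ab2_in, ab2_out = ["b1"] + ab1_out, ["o2,2", "o2,3", "o2,4", "o2,5"]
obj, const = abstcg_reformulate(obj, [(ab1_in, ab1_out), (ab2_in, ab2_out)])
print(obj, const)
# {'o1,1': 4, 'o1,2': 4, 'o1,3': 4, 'o1,4': 4,
#  'o2,2': 1, 'o2,3': 1, 'o2,4': 1, 'o2,5': 1} 1
```

The result matches the reformulated objective stated in Example 13: all core variables leave the objective and the constant term becomes 1.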
While the main focus of this work is on the UniMaxSAT framework, we developed a prototype implementation of AbstCG on top of CGSS2, a state-of-the-art C++ implementation of OLL (Ihalainen, 2022). This implementation of AbstCG is available online at https://bitbucket.org/coreo-group/cgss2/ as a command line option of CGSS2. When a new core is extracted, the prototype implementation dynamically selects between relaxing it in the style of OLL and relaxing it in the style of AbstCG, always choosing the relaxation that results in fewer clauses added. Intuitively, this choice aims to balance the benefits of relaxing cores in the style of AbstCG (such as removing all literals in the core from the reformulated objective) with the potentially smaller size of the core-relaxation constraint of OLL.

We empirically compare the runtimes of CGSS2-AbstCG (our prototype implementation of AbstCG) to those of CGSS2-OLL, i.e., the base CGSS2 implementation. We emphasize that the goal of these experiments is to demonstrate that UniMaxSAT allows for novel algorithmic instantiations that can be used to obtain practical solvers competitive with the state of the art. As benchmarks, we used the 558 weighted instances (see footnote 9) from the exact track of MaxSAT Evaluation 2023 (https://maxsat-evaluations.github.io/2023/). The experiments were run on 2.6-GHz AMD EPYC 7H12 processors under a per-instance 3600-second time and 16-GB memory limit.

Figure 5 (left) provides a runtime comparison of the two solvers with all additional heuristics implemented in CGSS2 enabled. We observe that CGSS2-AbstCG is competitive with the base CGSS2 (referred to as CGSS2-OLL in the following). CGSS2-AbstCG exhibits somewhat faster runtimes on instances that take around 2000 seconds to solve, solving 415 instances within 2100 seconds compared to 412 for CGSS2-OLL. Both solvers solve 419 instances within the per-instance time limit.
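The dynamic selection between the two relaxation styles described above can be sketched as follows. The encoder interface and the stub clause counts are assumptions for illustration, not CGSS2's actual API; the prototype's criterion (keep whichever relaxation adds fewer clauses) is what the sketch implements.

```python
from typing import Callable, List, Sequence, Tuple

Clause = Tuple[int, ...]  # a clause as a tuple of DIMACS-style literals
Encoder = Callable[[Sequence[int]], List[Clause]]

def relax_core(core: Sequence[int], encode_oll: Encoder,
               encode_abstcg: Encoder) -> Tuple[str, List[Clause]]:
    """Relax an extracted core, keeping whichever style adds fewer clauses.

    Here both candidate relaxations are built and compared; a real solver
    might instead estimate the clause counts before encoding.
    """
    candidates = {"OLL": encode_oll(core), "AbstCG": encode_abstcg(core)}
    style = min(candidates, key=lambda s: len(candidates[s]))
    return style, candidates[style]

# Demo with hypothetical stub encoders standing in for the real encodings:
oll_stub = lambda core: [tuple(core)] * (len(core) ** 2)    # quadratic stub
abstcg_stub = lambda core: [tuple(core)] * (4 * len(core))  # linear stub
style, clauses = relax_core([1, 2, 3], oll_stub, abstcg_stub)
print(style, len(clauses))  # prints "OLL 9": 9 clauses beat 12
```

With the stub growth rates above, the OLL-style relaxation wins on small cores and the AbstCG-style relaxation on larger ones, mirroring the trade-off the prototype balances.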
For a more fine-grained view, we also ran both solvers with the weight-aware core extraction (WCE) (Berg & Järvisalo, 2017) and structure sharing (SS) (Ihalainen et al., 2021) techniques disabled. We note that with these two techniques CGSS2 delays the core-relaxation steps and heuristically attempts to order the literals in the individual core relaxations in a way that allows reusing constraints between multiple core relaxations. Figure 5 (right) provides a runtime comparison of the two solvers with WCE and SS disabled. Interestingly, disabling these heuristics seems to degrade the performance of CGSS2-OLL more than that of CGSS2-AbstCG, with the former solving 406 instances and the latter 412 within the time limit. The results suggest that a solver using purely OLL benefits more from WCE and SS than a solver using the AbstCG core relaxation. Overall, as a proof of concept, AbstCG appears to be an interesting example of a new instantiation of the general framework. Beyond the main focus of this work, we note that a more fine-grained integration of the AbstCG relaxation within CGSS2 could lead to further improvements in solver runtimes.

Footnote 9: Note that when all literals in a core to be relaxed have the same coefficients, the reformulation used by AbstCG is exactly the same as the one used by OLL. We hence excluded the unweighted benchmarks from the experiments.

[Figure 5: Number of solved instances for CGSS2-OLL and CGSS2-AbstCG. Left: runtime comparison of OLL and AbstCG with WCE and SS enabled. Right: runtime comparison of OLL and AbstCG with WCE and SS disabled.]

8. Conclusions

Building on the recently proposed notion of abstract cores, we developed a general algorithmic framework that captures in a unifying way the computations performed by SAT-based MaxSAT solvers.
The framework covers the three most popular modern practical algorithmic approaches to MaxSAT, namely, the core-guided, the implicit hitting set (IHS), and the objective-bounding approaches, variants of which are today implemented in various publicly available MaxSAT solvers. The framework provides a uniform way of proving the correctness of the current and potential forthcoming algorithms of the three approaches, as well as of algorithms combining techniques of the different approaches. To illustrate this, we formally detailed how the framework captures various existing instantiations of the approaches. The framework also suggests novel algorithmic variants through different instantiations; we detailed one such instantiation and showed as a proof of concept that it resulted in a potentially interesting solver variant for MaxSAT from a practical perspective. While the framework developed in this work captures quite generally the current mainstream approaches to SAT-based MaxSAT solving, as a potential direction for further study it would be interesting to develop further understanding of the distinguishing features of branch-and-bound based MaxSAT solvers and their connections to our framework. Furthermore, while our discussion was grounded in MaxSAT, we note that the framework could be extended to cover related constraint optimization paradigms such as pseudo-Boolean optimization (Devriendt et al., 2021; Smirnov et al., 2021, 2022), finite-domain constraint optimization (Delisle & Bacchus, 2013; Gange et al., 2020), and answer set programming (Andres et al., 2012; Saikko et al., 2018; Alviano & Dodaro, 2020), for which core-guided and IHS-style solvers have been developed.
Such extensions would seem reasonable since, as implemented in the already-proposed solvers for PBO, COP, and ASP, the core-guided and IHS approaches are essentially agnostic to the constraint language at hand, assuming that a suitable decision oracle for core extraction exists. In particular, the UniMaxSAT framework could analogously be presented on the level of these more high-level constraint languages instead of the propositional representation focused on in this article. Studying new ways of instantiating the framework towards developing novel practical SAT-based algorithms for MaxSAT and related constraint optimization paradigms is also an interesting direction for further work.

Acknowledgments

This work was financially supported by the University of Helsinki Doctoral Programme in Computer Science DoCS and the Research Council of Finland (grants 342145 and 356046). The authors thank the Finnish Computing Competence Infrastructure (FCCI) for computational and data storage resources.

Appendix A. Proofs

A.1 Correctness of WPM3 via UniMaxSAT

The following proposition establishes that the instantiation of Add-AbstractionSets that simulates the core-relaxation steps performed by WPM3 (as detailed in Section 6.3) satisfies condition (ii) of a cardinality-based CG instantiation (Definition 6). In the following, assume that WPM3 instantiated in UniMaxSAT is invoked on a MaxSAT instance. Fix an iteration and let $AB$ and $K$ be the set of abstraction sets introduced and the cores accumulated at the beginning of the iteration. Assume that a core $C = (b_1 \vee \cdots \vee b_m \vee o^{ab_1}_{t_1} \vee \cdots \vee o^{ab_n}_{t_n})$, where each $b_i$ is an objective variable and each $o^{ab_k}_{t_k}$ is an output of a previously introduced abstraction set $ab_k$, is extracted. As described in Section 6.3, the core $C$ is extended to $C' = (b_1 \vee \cdots \vee b_m) \vee \bigvee_{i=1}^{n} \big( o^{ab_i}_1 \vee \cdots \vee o^{ab_i}_{|out_i|} \big)$ by adding all outputs of the abstraction sets appearing in $C$. The new abstraction set $ab_{C'} = \big( in_{C'} = \{b_1, \ldots, b_m\} \cup \bigcup_{i=1}^{n} in_i,\ D_{C'},\ out_{C'} = \{o^{C'}_1, \ldots, o^{C'}_{|C'|-1}\} \big)$ is introduced. Here $in_i$ is the input set of the previously introduced abstraction set $ab_i$, $D_{C'} = \big\{ \text{as CNF}\big( \textstyle\sum_{b \in in_{C'}} b \geq bound_{C'} + j \rightarrow o^{C'}_j \big) \mid 1 \leq j \leq |C'| - 1 \big\}$, and
$$bound_{C'} = \begin{cases} 1 & \text{if } C \text{ only contains objective variables,} \\ 1 + \sum_{i=1}^{n} bound_i & \text{otherwise.} \end{cases}$$

We show that $C'$ fulfills condition (ii) of Definition 6. The non-trivial part is to show that the number of variables in $C'$ assigned to 1 by any satisfying assignment to the cores and definitions found so far is one more than the number of outputs of the abstraction set introduced in its relaxation that are assigned to 1. This is formalized in the following proposition.

Proposition 5. Let $\tau$ be an assignment satisfying $DEF(AB) \cup K \cup D_{C'} \cup \{C'\}$. We have that $\sum_{b \in C'} \tau(b) = \sum_{o \in out_{C'}} \tau(o) + 1$.

The proof of Proposition 5 employs the following lemma.

Lemma 5. Let $\tau$ be an assignment satisfying $DEF(AB) \cup K \cup D_{C'} \cup \{C'\}$. We have that $\sum_{b \in in_{C'}} \tau(b) \geq bound_{C'}$.

Proof. By induction on the added abstraction sets. Base case: $C$ only contains objective variables. Now $bound_{C'} = 1$ and the statement follows by noting that $C = C' = in_{C'}$ and that $\tau$ satisfies $C'$. Induction step: Assume that $C$ contains outputs of previous abstraction sets (for which the statement holds by the induction assumption). Then
$$\sum_{b \in in_{C'}} \tau(b) = \sum_{i=1}^{m} \tau(b_i) + \sum_{i=1}^{n} \sum_{b \in in_i} \tau(b) \overset{\star}{\geq} \sum_{i=1}^{m} \tau(b_i) + \sum_{i=1}^{n} bound_i = \sum_{i=1}^{m} \tau(b_i) + bound_{C'} - 1,$$
where the step marked with $\star$ uses the induction assumption. If $\sum_{i=1}^{m} \tau(b_i) \geq 1$, we are done. Otherwise, assume that $\sum_{i=1}^{m} \tau(b_i) = 0$. We establish that there exists an abstraction set $ab_k$ with more than $bound_k$ inputs set to true by $\tau$. Since $\tau(C') = 1$, we can fix $k$ to be an index for which $\tau(o^{ab_k}_1) = 1$. Now, if the core $C_k$ that prompted the addition of $ab_k$ does not contain any outputs of previous abstraction sets, we are done, since then $bound_k = 1$ and, by the definitions of abstraction sets (satisfied by $\tau$), $\tau(o^{ab_k}_1) = 1$ implies that at least two inputs are assigned to 1. Otherwise, we recurse: since $\tau(C_k) = 1$, there is another abstraction set for which $\tau$ assigns at least two inputs to true.
At some point, the core corresponding to the found abstraction set will not contain any output literals, at which point the recursion is guaranteed to end. As $in_{C'}$ contains the inputs of every $ab_i$, the abstraction set with $bound_k + 1$ true inputs then yields $\sum_{b \in in_{C'}} \tau(b) \geq \sum_{i=1}^{n} bound_i + 1 = bound_{C'}$.

Corollary 2. $\sum_{b \in in_{C'}} \tau(b) - bound_{C'} = \sum_{o \in out_{C'}} \tau(o)$.

Proof. Follows by Lemma 5 and the fact that $\tau$ satisfies $D_{C'}$.

Proof of Proposition 5. If $C$ includes no outputs from previous abstraction sets, the statement follows by noting that $C = C' = in_{C'}$. Otherwise we have
$$\sum_{b \in C'} \tau(b) = \sum_{i=1}^{m} \tau(b_i) + \sum_{i=1}^{n} \sum_{o \in out_i} \tau(o) \overset{\star}{=} \sum_{i=1}^{m} \tau(b_i) + \sum_{i=1}^{n} \Big( \sum_{b \in in_i} \tau(b) - bound_i \Big) = \sum_{b \in in_{C'}} \tau(b) - \sum_{i=1}^{n} bound_i = \sum_{b \in in_{C'}} \tau(b) - (bound_{C'} - 1) \overset{\star}{=} \sum_{o \in out_{C'}} \tau(o) + 1,$$
where the $\star$ steps follow by Corollary 2.

A.2 PMRES and WPM3 Compute Abstract Candidates on Unweighted Instances

We prove Proposition 3 separately for PMRES and WPM3. For the following, let $AB_i$ and $K_i$ be the set of abstraction sets and abstract cores extracted so far, respectively. Both proofs make use of the fact that, by Lemma 3, it suffices to show that there exists a solution $\tau_i \supseteq \gamma^A_i = \{\neg x \mid x \in var(O^R_i)\}$ to $DEF(AB_i) \cup K_i$, i.e., one that extends $\gamma^A_i$.

Proof of Proposition 3 for PMRES. We prove the existence of such a $\tau_i$ by induction on $i$. More precisely, for every $i$ we construct an assignment $\tau^o_i$ to the objective variables that extends to such a $\tau_i$. As a tool for the induction, we use a function $\beta_i$ that maps each variable $x \in var(\gamma^A_i)$ to a unique objective variable $\beta_i(x) = y \in var(O)$. The mapping has the following properties: (1) $\tau^o_i(y) = 0$, and (2) for every $x \in var(\gamma^A_i)$, any solution $\delta$ to $DEF(AB_i) \cup K_i$ that agrees with $\tau_i$ on every variable in $var(O)$ except $\beta_i(x)$ also agrees with $\tau_i$ on every variable in $var(O^R_i)$ except for $x$.

Base case ($i = 1$): We have $K_1 = DEF(AB_1) = \emptyset$, so $\tau^o_1 = \gamma^A_1 = \{\neg x \mid x \in var(O)\}$ and $\beta_1(x) = x$ for each $x \in var(O)$.

Induction step: Let the core obtained in iteration $i$ be $C_i = (x_1 \vee \cdots \vee x_n)$ and the abstraction set introduced be $ab_i = \big( C_i,\ \{ x_j \wedge (x_{j+1} \vee \cdots \vee x_n) \rightarrow o^i_j \mid 1 \leq j \leq n-1 \},\ \{o^i_1, \ldots, o^i_{n-1}\} \big)$.
By the induction assumption, there exists an assignment $\tau^o_i$ to the objective variables that extends to a solution $\tau_i \supseteq \gamma^A_i$ of $DEF(AB_i) \cup K_i$, and a function $\beta_i$ that satisfies properties (1) and (2) outlined in the beginning of this proof. Let then $b = \beta_i(x_n)$ and $\tau^o_{i+1} = (\tau^o_i \cup \{b\}) \setminus \{\neg b\}$, and consider the (unique) extension of $\tau_{i+1} \supseteq \tau^o_{i+1}$ to a solution of $DEF(AB_{i+1}) = DEF(AB_i \cup \{ab_i\})$. By property (2) of $\beta_i$, $\tau_{i+1}$ agrees with $\tau_i$ on all variables in $var(O^R_i)$ except for $x_n$, which $\tau_i$ assigns to 0 and $\tau_{i+1}$ to 1. As $\tau_i$ is a solution to $K_i$ and $\tau_{i+1}(x_n) = 1$, it follows that $\tau_{i+1}$ is a solution to $K_{i+1} = K_i \cup \{C_i\}$. Finally, let $\beta_{i+1}(x) = \beta_i(x)$ for all $x$ except for every $x_j \in C_i$ (as these variables will no longer be part of the assumptions in subsequent iterations), and let $\beta_{i+1}(o^i_j) = \beta_i(x_j)$ for each $o^i_j \in \{o^i_1, \ldots, o^i_{n-1}\}$. Then the mapping $\beta_{i+1}$ has the properties (1) and (2) outlined in the beginning of the proof.

Proof of Proposition 3 for WPM3. We show the existence of a solution $\tau_i \supseteq \gamma^A_i = \{\neg x \mid x \in var(O^R_i)\}$ to $K_i \cup DEF(AB_i)$. The proof is by induction on the iteration $i$.

Base case ($i = 1$): Immediate by $K_1 = DEF(AB_1) = \emptyset$ and $O = O^R_1$.

Induction step: Assume that the (extended) core $C_i = (b_1 \vee \cdots \vee b_m) \vee \bigvee_{k=1}^{n} \big( o^{ab_k}_1 \vee \cdots \vee o^{ab_k}_{|out_k|} \big)$ is extracted in iteration $i$, and the new abstraction set $ab_i = (in_i, D_i, out_i)$ is introduced as detailed in Section 6.3.5. By the induction assumption, there exists a solution $\tau_i \supseteq \gamma^A_i = \{\neg x \mid x \in var(O^R_i)\}$ to $K_i \cup DEF(AB_i)$. By the properties of the Extract-AbstractCore subroutine, $\tau_i$ falsifies $C_i$. Let $\tau^o_i$ be the restriction of $\tau_i$ to the objective variables and select any $b \in in_i$ for which $\tau^o_i(b) = 0$; at least one such $b$ exists as $\tau_i(C_i) = 0$. Finally, let $\tau^o_{i+1}$ be the assignment to the objective variables that agrees with $\tau^o_i$ on all variables except $b$, and consider the (unique) extension $\tau_{i+1} \supseteq \tau^o_{i+1}$ of $\tau^o_{i+1}$ to a solution of $DEF(AB_{i+1}) = DEF(AB_i \cup \{ab_i\})$.
The proposition follows from showing two things: (a) $\tau_{i+1}$ satisfies $K_{i+1} = K_i \cup \{C_i\}$, and (b) $\tau_{i+1} \supseteq \gamma^A_{i+1} = \{\neg x \mid x \in var(O^R_{i+1})\}$.

(a) For the non-trivial case, assume $b \notin C_i$. Then $b$ is an input to some abstraction set $ab_j$ for which $o^{ab_j}_1 \in C_i$. We show that $\tau_{i+1}$ assigns $bound_j + 1$ inputs of $ab_j$ to 1; this follows by a recursive argument similar to the one made in the proof of Lemma 5. By the construction of the abstraction sets, and by recursing if needed, we can assume that the core $C_j$ that prompted the addition of $ab_j$ only contains objective variables. As $\tau_i(C_j) = 1$, $\tau_i$ assigns $\tau_i(b_o) = 1$ for at least one variable $b_o \in C_j$. As $b \in C_j$, $\tau_i(b) = 0$, and $\tau_{i+1}$ agrees with $\tau_i$ on all objective variables except $b$, we have that $\tau_{i+1}$ assigns at least two variables in $C_j$ to 1. As $bound_j = 1$, $\tau_{i+1}$ thus also assigns $\tau_{i+1}(o^{ab_j}_1) = 1$ and satisfies $C_i$. Furthermore, by the definitions of abstraction sets, $\tau_{i+1}$ assigns at least the same variables to 1 as $\tau_i$; thus it clearly also satisfies all other cores. We conclude that $\tau_{i+1}$ is a solution to $K_{i+1}$.

(b) Follows from the properties of core-guided instantiations. As $\tau_{i+1}$ assigns exactly one more objective variable to 1 than $\tau_i$, we have $O(\tau_{i+1}) = O(\tau_i) + 1$, which implies $O^R_{i+1}(\tau_{i+1}) = O^R_i(\tau_{i+1}) = O^R_i(\tau_i) + 1$. By the induction assumption, $O(\tau_i)$ is equal to the constant term $W^{lb}_i$ of $O^R_i$, as $\tau_i$ assigns every variable in $var(O^R_i)$ to 0. Since $W^{lb}_i + 1 = W^{lb}_{i+1}$, it follows that $O^R_{i+1}(\tau_{i+1}) = W^{lb}_{i+1}$, so $\tau_{i+1}$ assigns every variable in $var(O^R_{i+1})$ to 0.

References

Abío, I., Nieuwenhuis, R., Oliveras, A., & Rodríguez-Carbonell, E. (2013). A parametric approach for smaller and better encodings of cardinality constraints. In Schulte, C. (Ed.), Principles and Practice of Constraint Programming - 19th International Conference, CP 2013, Uppsala, Sweden, September 16-20, 2013. Proceedings, Vol. 8124 of Lecture Notes in Computer Science, pp. 80–96. Springer.

Abramé, A., & Habet, D. (2014). Ahmaxsat: Description and evaluation of a branch and bound Max-SAT solver. J.
Satisf. Boolean Model. Comput., 9(1), 89–128.

Abramé, A., & Habet, D. (2016). Learning nobetter clauses in Max-SAT branch and bound solvers. In 28th IEEE International Conference on Tools with Artificial Intelligence, ICTAI 2016, San Jose, CA, USA, November 6-8, 2016, pp. 452–459. IEEE Computer Society.

Alviano, M., & Dodaro, C. (2020). Unsatisfiable core analysis and aggregates for optimum stable model search. Fundamenta Informaticae, 176(3-4), 271–297.

Alviano, M., Dodaro, C., & Ricca, F. (2015). A MaxSAT algorithm using cardinality constraints of bounded size. In Yang, Q., & Wooldridge, M. J. (Eds.), Proceedings of the Twenty-Fourth International Joint Conference on Artificial Intelligence, IJCAI 2015, Buenos Aires, Argentina, July 25-31, 2015, pp. 2677–2683. AAAI Press.

Andres, B., Kaufmann, B., Matheis, O., & Schaub, T. (2012). Unsatisfiability-based optimization in clasp. In Dovier, A., & Costa, V. S. (Eds.), Technical Communications of the 28th International Conference on Logic Programming, ICLP 2012, September 4-8, 2012, Budapest, Hungary, Vol. 17 of LIPIcs, pp. 211–221. Schloss Dagstuhl - Leibniz-Zentrum für Informatik.

Ansótegui, C., Bonet, M. L., & Levy, J. (2013). SAT-based MaxSAT algorithms. Artificial Intelligence, 196, 77–105.

Ansótegui, C., & Gabàs, J. (2017). WPM3: An (in)complete algorithm for weighted partial MaxSAT. Artificial Intelligence, 250, 37–57.

Ansótegui, C., Gabàs, J., & Levy, J. (2016). Exploiting subproblem optimization in SAT-based MaxSAT algorithms. Journal of Heuristics, 22(1), 1–53.

Asín, R., Nieuwenhuis, R., Oliveras, A., & Rodríguez-Carbonell, E. (2011). Cardinality networks: A theoretical and empirical study. Constraints: An International Journal, 16(2), 195–221.

Audemard, G., Lagniez, J., & Simon, L. (2013). Improving glucose for incremental SAT solving with assumptions: Application to MUS extraction. In Järvisalo, M., & Gelder, A. V.
(Eds.), Theory and Applications of Satisfiability Testing - SAT 2013 - 16th International Conference, Helsinki, Finland, July 8-12, 2013. Proceedings, Vol. 7962 of Lecture Notes in Computer Science, pp. 309–317. Springer.

Bacchus, F., Järvisalo, M., & Martins, R. (2019). MaxSAT Evaluation 2018: New developments and detailed results. Journal on Satisfiability, Boolean Modeling and Computation, 11(1), 99–131.

Bacchus, F., Järvisalo, M., & Martins, R. (2021). Maximum satisfiability. In Biere, A., Heule, M., van Maaren, H., & Walsh, T. (Eds.), Handbook of Satisfiability - Second Edition, Vol. 336 of Frontiers in Artificial Intelligence and Applications, pp. 929–991. IOS Press.

Bacchus, F., & Narodytska, N. (2014). Cores in core based MaxSat algorithms: An analysis. In Sinz, C., & Egly, U. (Eds.), Theory and Applications of Satisfiability Testing - SAT 2014 - 17th International Conference, Held as Part of the Vienna Summer of Logic, VSL 2014, Vienna, Austria, July 14-17, 2014. Proceedings, Vol. 8561 of Lecture Notes in Computer Science, pp. 7–15. Springer.

Bailleux, O., & Boufkhad, Y. (2003). Efficient CNF encoding of Boolean cardinality constraints. In Rossi, F. (Ed.), Principles and Practice of Constraint Programming - CP 2003, 9th International Conference, CP 2003, Kinsale, Ireland, September 29 - October 3, 2003, Proceedings, Vol. 2833 of Lecture Notes in Computer Science, pp. 108–122. Springer.

Bailleux, O., Boufkhad, Y., & Roussel, O. (2009). New encodings of pseudo-Boolean constraints into CNF. In Kullmann, O. (Ed.), Theory and Applications of Satisfiability Testing - SAT 2009, 12th International Conference, SAT 2009, Swansea, UK, June 30 - July 3, 2009. Proceedings, Vol. 5584 of Lecture Notes in Computer Science, pp. 181–194. Springer.

Barrett, C. W., Sebastiani, R., Seshia, S. A., & Tinelli, C. (2021). Satisfiability modulo theories. In Biere, A., Heule, M., van Maaren, H., & Walsh, T.
(Eds.), Handbook of Satisfiability - Second Edition, Vol. 336 of Frontiers in Artificial Intelligence and Applications, pp. 1267–1329. IOS Press.

Baselice, S., Bonatti, P. A., & Gelfond, M. (2005). Towards an integration of answer set and constraint solving. In Gabbrielli, M., & Gupta, G. (Eds.), Logic Programming, 21st International Conference, ICLP 2005, Sitges, Spain, October 2-5, 2005, Proceedings, Vol. 3668 of Lecture Notes in Computer Science, pp. 52–66. Springer.

Berg, J., Bacchus, F., & Poole, A. (2020). Abstract cores in implicit hitting set MaxSat solving. In Pulina, L., & Seidl, M. (Eds.), Theory and Applications of Satisfiability Testing - SAT 2020 - 23rd International Conference, Alghero, Italy, July 3-10, 2020, Proceedings, Vol. 12178 of Lecture Notes in Computer Science, pp. 277–294. Springer.

Berg, J., Demirovic, E., & Stuckey, P. J. (2019). Core-boosted linear search for incomplete MaxSAT. In Rousseau, L., & Stergiou, K. (Eds.), Integration of Constraint Programming, Artificial Intelligence, and Operations Research - 16th International Conference, CPAIOR 2019, Thessaloniki, Greece, June 4-7, 2019, Proceedings, Vol. 11494 of Lecture Notes in Computer Science, pp. 39–56. Springer.

Berg, J., & Järvisalo, M. (2017). Weight-aware core extraction in SAT-based MaxSAT solving. In Beck, J. C. (Ed.), Principles and Practice of Constraint Programming - 23rd International Conference, CP 2017, Melbourne, VIC, Australia, August 28 - September 1, 2017, Proceedings, Vol. 10416 of Lecture Notes in Computer Science, pp. 652–670. Springer.

Berre, D. L., & Parrain, A. (2010). The sat4j library, release 2.2. Journal on Satisfiability, Boolean Modeling and Computation, 7(2-3), 59–64.

Bjørner, N. S., & Fazekas, K. (2023). On incremental pre-processing for SMT. In Pientka, B., & Tinelli, C. (Eds.), Automated Deduction - CADE 29 - 29th International Conference on Automated Deduction, Rome, Italy, July 1-4, 2023, Proceedings, Vol.
14132 of Lecture Notes in Computer Science, pp. 41–60. Springer.

Cai, S., Luo, C., Thornton, J., & Su, K. (2014). Tailoring local search for partial MaxSAT. In Brodley, C. E., & Stone, P. (Eds.), Proceedings of the Twenty-Eighth AAAI Conference on Artificial Intelligence, July 27-31, 2014, Québec City, Québec, Canada, pp. 2623–2629. AAAI Press.

Chu, Y., Cai, S., & Luo, C. (2023). NuWLS: Improving local search for (weighted) partial MaxSAT by new weighting techniques. In Williams, B., Chen, Y., & Neville, J. (Eds.), Thirty-Seventh AAAI Conference on Artificial Intelligence, AAAI 2023, Thirty-Fifth Conference on Innovative Applications of Artificial Intelligence, IAAI 2023, Thirteenth Symposium on Educational Advances in Artificial Intelligence, EAAI 2023, Washington, DC, USA, February 7-14, 2023, pp. 3915–3923. AAAI Press.

Cimatti, A., Franzén, A., Griggio, A., Sebastiani, R., & Stenico, C. (2010). Satisfiability modulo the theory of costs: Foundations and applications. In Esparza, J., & Majumdar, R. (Eds.), Tools and Algorithms for the Construction and Analysis of Systems, 16th International Conference, TACAS 2010, Held as Part of the Joint European Conferences on Theory and Practice of Software, ETAPS 2010, Paphos, Cyprus, March 20-28, 2010. Proceedings, Vol. 6015 of Lecture Notes in Computer Science, pp. 99–113. Springer.

Codish, M., & Zazon-Ivry, M. (2010). Pairwise cardinality networks. In Clarke, E. M., & Voronkov, A. (Eds.), Logic for Programming, Artificial Intelligence, and Reasoning - 16th International Conference, LPAR-16, Dakar, Senegal, April 25-May 1, 2010, Revised Selected Papers, Vol. 6355 of Lecture Notes in Computer Science, pp. 154–172. Springer.

Davies, J., & Bacchus, F. (2011). Solving MAXSAT by solving a sequence of simpler SAT instances. In Lee, J. H.
(Ed.), Principles and Practice of Constraint Programming - CP 2011 - 17th International Conference, CP 2011, Perugia, Italy, September 12-16, 2011. Proceedings, Vol. 6876 of Lecture Notes in Computer Science, pp. 225–239. Springer.

Davies, J., & Bacchus, F. (2013). Postponing optimization to speed up MAXSAT solving. In Schulte, C. (Ed.), Principles and Practice of Constraint Programming - 19th International Conference, CP 2013, Uppsala, Sweden, September 16-20, 2013. Proceedings, Vol. 8124 of Lecture Notes in Computer Science, pp. 247–262. Springer.

Delisle, E., & Bacchus, F. (2013). Solving weighted CSPs by successive relaxations. In Schulte, C. (Ed.), Principles and Practice of Constraint Programming - 19th International Conference, CP 2013, Uppsala, Sweden, September 16-20, 2013. Proceedings, Vol. 8124 of Lecture Notes in Computer Science, pp. 273–281. Springer.

Devriendt, J., Gocht, S., Demirovic, E., Nordström, J., & Stuckey, P. J. (2021). Cutting to the core of pseudo-Boolean optimization: Combining core-guided search with cutting planes reasoning. In Thirty-Fifth AAAI Conference on Artificial Intelligence, AAAI 2021, Thirty-Third Conference on Innovative Applications of Artificial Intelligence, IAAI 2021, The Eleventh Symposium on Educational Advances in Artificial Intelligence, EAAI 2021, Virtual Event, February 2-9, 2021, pp. 3750–3758. AAAI Press.

Eén, N., & Sörensson, N. (2003). Temporal induction by incremental SAT solving. In Strichman, O., & Biere, A. (Eds.), First International Workshop on Bounded Model Checking, BMC@CAV 2003, Boulder, Colorado, USA, July 13, 2003, Vol. 89 of Electronic Notes in Theoretical Computer Science, pp. 543–560. Elsevier.

Eén, N., & Sörensson, N. (2006). Translating pseudo-Boolean constraints into SAT. Journal on Satisfiability, Boolean Modeling and Computation, 2(1-4), 1–26.

Fazekas, K., Bacchus, F., & Biere, A. (2018). Implicit hitting set algorithms for maximum satisfiability modulo theories.
In Galmiche, D., Schulz, S., & Sebastiani, R. (Eds.), Automated Reasoning - 9th International Joint Conference, IJCAR 2018, Held as Part of the Federated Logic Conference, FloC 2018, Oxford, UK, July 14-17, 2018, Proceedings, Vol. 10900 of Lecture Notes in Computer Science, pp. 134–151. Springer.

Fazekas, K., Biere, A., & Scholl, C. (2019). Incremental inprocessing in SAT solving. In Janota, M., & Lynce, I. (Eds.), Theory and Applications of Satisfiability Testing - SAT 2019 - 22nd International Conference, SAT 2019, Lisbon, Portugal, July 9-12, 2019, Proceedings, Vol. 11628 of Lecture Notes in Computer Science, pp. 136–154. Springer.

Fu, Z., & Malik, S. (2006). On solving the partial MAX-SAT problem. In Biere, A., & Gomes, C. P. (Eds.), Theory and Applications of Satisfiability Testing - SAT 2006, 9th International Conference, Seattle, WA, USA, August 12-15, 2006, Proceedings, Vol. 4121 of Lecture Notes in Computer Science, pp. 252–265. Springer.

Gange, G., Berg, J., Demirovic, E., & Stuckey, P. J. (2020). Core-guided and core-boosted search for CP. In Hebrard, E., & Musliu, N. (Eds.), Integration of Constraint Programming, Artificial Intelligence, and Operations Research - 17th International Conference, CPAIOR 2020, Vienna, Austria, September 21-24, 2020, Proceedings, Vol. 12296 of Lecture Notes in Computer Science, pp. 205–221. Springer.

Gebser, M., Ostrowski, M., & Schaub, T. (2009). Constraint answer set solving. In Hill, P. M., & Warren, D. S. (Eds.), Logic Programming, 25th International Conference, ICLP 2009, Pasadena, CA, USA, July 14-17, 2009. Proceedings, Vol. 5649 of Lecture Notes in Computer Science, pp. 235–249. Springer.

Heras, F., Larrosa, J., & Oliveras, A. (2008). MiniMaxSAT: An efficient weighted Max-SAT solver. Journal of Artificial Intelligence Research, 31, 1–32.

Heras, F., Morgado, A., & Marques-Silva, J. (2011). Core-guided binary search algorithms for maximum satisfiability. In Burgard, W., & Roth, D.
(Eds.), Proceedings of the Twenty-Fifth AAAI Conference on Artificial Intelligence, AAAI 2011, San Francisco, California, USA, August 7-11, 2011. AAAI Press.

Hölldobler, S., Manthey, N., & Steinke, P. (2012). A compact encoding of pseudo-Boolean constraints into SAT. In Glimm, B., & Krüger, A. (Eds.), KI 2012: Advances in Artificial Intelligence - 35th Annual German Conference on AI, Saarbrücken, Germany, September 24-27, 2012. Proceedings, Vol. 7526 of Lecture Notes in Computer Science, pp. 107–118. Springer.

Ignatiev, A., Morgado, A., Manquinho, V. M., Lynce, I., & Marques-Silva, J. (2014). Progression in maximum satisfiability. In Schaub, T., Friedrich, G., & O'Sullivan, B. (Eds.), ECAI 2014 - 21st European Conference on Artificial Intelligence, 18-22 August 2014, Prague, Czech Republic - Including Prestigious Applications of Intelligent Systems (PAIS 2014), Vol. 263 of Frontiers in Artificial Intelligence and Applications, pp. 453–458. IOS Press.

Ihalainen, H. (2022). Refined core relaxations for core-guided maximum satisfiability algorithms. Master's thesis, University of Helsinki. http://hdl.handle.net/10138/351207.

Ihalainen, H., Berg, J., & Järvisalo, M. (2021). Refined core relaxation for core-guided MaxSAT solving. In Michel, L. D. (Ed.), 27th International Conference on Principles and Practice of Constraint Programming, CP 2021, Montpellier, France (Virtual Conference), October 25-29, 2021, Vol. 210 of LIPIcs, pp. 28:1–28:19. Schloss Dagstuhl - Leibniz-Zentrum für Informatik.

Ihalainen, H., Berg, J., & Järvisalo, M. (2022). Clause redundancy and preprocessing in maximum satisfiability. In Blanchette, J., Kovács, L., & Pattinson, D. (Eds.), Automated Reasoning - 11th International Joint Conference, IJCAR 2022, Haifa, Israel, August 8-10, 2022, Proceedings, Vol. 13385 of Lecture Notes in Computer Science, pp. 75–94. Springer.

Ihalainen, H., Berg, J., & Järvisalo, M. (2023).
Unifying core-guided and implicit hitting set based optimization. In Proceedings of the Thirty-Second International Joint Conference on Artificial Intelligence, IJCAI 2023, 19th-25th August 2023, Macao, SAR, China, pp. 1935–1943. ijcai.org.

Järvisalo, M., Heule, M., & Biere, A. (2012). Inprocessing rules. In Gramlich, B., Miller, D., & Sattler, U. (Eds.), Automated Reasoning - 6th International Joint Conference, IJCAR 2012, Manchester, UK, June 26-29, 2012. Proceedings, Vol. 7364 of Lecture Notes in Computer Science, pp. 355–370. Springer.

Jiang, Y., Kautz, H., & Selman, B. (1995). Solving problems with hard and soft constraints using a stochastic algorithm for MAX-SAT. In Proceedings of the 1st International Joint Workshop on Artificial Intelligence and Operations Research.

Joshi, S., Kumar, P., Rao, S., & Martins, R. (2019). Open-WBO-Inc: Approximation strategies for incomplete weighted MaxSAT. Journal on Satisfiability, Boolean Modeling and Computation, 11(1), 73–97.

Joshi, S., Martins, R., & Manquinho, V. M. (2015). Generalized totalizer encoding for pseudo-Boolean constraints. In Pesant, G. (Ed.), Principles and Practice of Constraint Programming - 21st International Conference, CP 2015, Cork, Ireland, August 31 - September 4, 2015, Proceedings, Vol. 9255 of Lecture Notes in Computer Science, pp. 200–209. Springer.

Karpinski, M., & Piotrów, M. (2019). Encoding cardinality constraints using multiway merge selection networks. Constraints: An International Journal, 24(3-4), 234–251.

Katsirelos, G. (2023). An analysis of core-guided maximum satisfiability solvers using linear programming. In Mahajan, M., & Slivovsky, F. (Eds.), 26th International Conference on Theory and Applications of Satisfiability Testing, SAT 2023, July 4-8, 2023, Alghero, Italy, Vol. 271 of LIPIcs, pp. 12:1–12:19. Schloss Dagstuhl - Leibniz-Zentrum für Informatik.

Koshimura, M., Zhang, T., Fujita, H., & Hasegawa, R. (2012). QMaxSAT: A partial MaxSAT solver.
Journal on Satisfiability, Boolean Modeling and Computation, 8(1/2), 95–100.

Lei, Z., & Cai, S. (2018). Solving (weighted) partial MaxSAT by dynamic local search for SAT. In Lang, J. (Ed.), Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence, IJCAI 2018, July 13-19, 2018, Stockholm, Sweden, pp. 1346–1352. ijcai.org.

Li, C. M., & Manyà, F. (2021). MaxSAT, hard and soft constraints. In Biere, A., Heule, M., van Maaren, H., & Walsh, T. (Eds.), Handbook of Satisfiability - Second Edition, Vol. 336 of Frontiers in Artificial Intelligence and Applications, pp. 903–927. IOS Press.

Li, C. M., Manyà, F., & Planes, J. (2005). Exploiting unit propagation to compute lower bounds in branch and bound Max-SAT solvers. In van Beek, P. (Ed.), Principles and Practice of Constraint Programming - CP 2005, 11th International Conference, CP 2005, Sitges, Spain, October 1-5, 2005, Proceedings, Vol. 3709 of Lecture Notes in Computer Science, pp. 403–414. Springer.

Li, C. M., Manyà, F., & Planes, J. (2006). Detecting disjoint inconsistent subformulas for computing lower bounds for Max-SAT. In Proceedings, The Twenty-First National Conference on Artificial Intelligence and the Eighteenth Innovative Applications of Artificial Intelligence Conference, July 16-20, 2006, Boston, Massachusetts, USA, pp. 86–91. AAAI Press.

Li, C., Xu, Z., Coll, J., Manyà, F., Habet, D., & He, K. (2021). Combining clause learning and branch and bound for MaxSAT. In Michel, L. D. (Ed.), 27th International Conference on Principles and Practice of Constraint Programming, CP 2021, Montpellier, France (Virtual Conference), October 25-29, 2021, Vol. 210 of LIPIcs, pp. 38:1–38:18. Schloss Dagstuhl - Leibniz-Zentrum für Informatik.

Li, C., Xu, Z., Coll, J., Manyà, F., Habet, D., & He, K. (2022). Boosting branch-and-bound MaxSAT solvers with clause learning. AI Communications, 35(2), 131–151.

Manthey, N., Philipp, T., & Steinke, P. (2014).
A more compact translation of pseudoboolean constraints into CNF such that generalized arc consistency is maintained. In Lutz, C., & Thielscher, M. (Eds.), KI 2014: Advances in Artificial Intelligence - 37th Annual German Conference on AI, Stuttgart, Germany, September 22-26, 2014. Proceedings, Vol. 8736 of Lecture Notes in Computer Science, pp. 123 134. Springer. Marques-Silva, J., Lynce, I., & Malik, S. (2021). Conflict-driven clause learning SAT solvers. In Handbook of Satisfiability (2 edition)., Vol. 336 of Frontiers in Artificial Intelligence and Applications, pp. 133 182. IOS Press. Marques-Silva, J., & Planes, J. (2007). On using unsatisfiability for solving maximum satisfiability. Co RR, abs/0712.1097. Morgado, A., Dodaro, C., & Marques-Silva, J. (2014). Core-guided Max SAT with soft cardinality constraints. In O Sullivan, B. (Ed.), Principles and Practice of Constraint Programming - 20th International Conference, CP 2014, Lyon, France, September 812, 2014. Proceedings, Vol. 8656 of Lecture Notes in Computer Science, pp. 564 573. Springer. Morgado, A., Heras, F., Liffiton, M. H., Planes, J., & Marques-Silva, J. (2013). Iterative and core-guided Max SAT solving: A survey and assessment. Constraints An International Journal, 18(4), 478 534. Nadel, A. (2020). Anytime algorithms for Max SAT and beyond. In 2020 Formal Methods in Computer Aided Design, FMCAD 2020, Haifa, Israel, September 21-24, 2020, p. 1. IEEE. Narodytska, N., & Bacchus, F. (2014). Maximum satisfiability using core-guided Max SAT resolution. In Brodley, C. E., & Stone, P. (Eds.), Proceedings of the Twenty-Eighth AAAI Conference on Artificial Intelligence, July 27 -31, 2014, Qu ebec City, Qu ebec, Canada, pp. 2717 2723. AAAI Press. Unifying SAT-Based Approaches to Maximum Satisfiability Solving Narodytska, N., & Bjørner, N. S. (2022). Analysis of core-guided Max Sat using cores and correction sets. In Meel, K. S., & Strichman, O. 
(Eds.), 25th International Conference on Theory and Applications of Satisfiability Testing, SAT 2022, August 2-5, 2022, Haifa, Israel, Vol. 236 of LIPIcs, pp. 26:1 26:20. Schloss Dagstuhl - Leibniz-Zentrum f ur Informatik. Nieuwenhuis, R., Oliveras, A., & Tinelli, C. (2006). Solving SAT and SAT modulo theories: From an abstract Davis Putnam Logemann Loveland procedure to DPLL(t). Journal of the ACM, 53(6), 937 977. Ogawa, T., Liu, Y., Hasegawa, R., Koshimura, M., & Fujita, H. (2013). Modulo based CNF encoding of cardinality constraints and its application to Max SAT solvers. In 25th IEEE International Conference on Tools with Artificial Intelligence, ICTAI 2013, Herndon, VA, USA, November 4-6, 2013, pp. 9 17. IEEE Computer Society. Paxian, T., Reimer, S., & Becker, B. (2018). Dynamic polynomial watchdog encoding for solving weighted Max SAT. In Beyersdorff, O., & Wintersteiger, C. M. (Eds.), Theory and Applications of Satisfiability Testing - SAT 2018 - 21st International Conference, SAT 2018, Held as Part of the Federated Logic Conference, Flo C 2018, Oxford, UK, July 9-12, 2018, Proceedings, Vol. 10929 of Lecture Notes in Computer Science, pp. 37 53. Springer. Piotr ow, M. (2020). UWr Max Sat: Efficient solver for Max SAT and pseudo-boolean problems. In 32nd IEEE International Conference on Tools with Artificial Intelligence, ICTAI 2020, Baltimore, MD, USA, November 9-11, 2020, pp. 132 136. IEEE. Planes, J. (2003). Improved branch and bound algorithms for Max-2-SAT and weighted Max-2-SAT. In Rossi, F. (Ed.), Principles and Practice of Constraint Programming - CP 2003, 9th International Conference, CP 2003, Kinsale, Ireland, September 29 - October 3, 2003, Proceedings, Vol. 2833 of Lecture Notes in Computer Science, p. 991. Springer. Prestwich, S. D. (2021). CNF encodings. In Biere, A., Heule, M., van Maaren, H., & Walsh, T. (Eds.), Handbook of Satisfiability (2 edition)., Vol. 336 of Frontiers in Artificial Intelligence and Applications, pp. 75 100. IOS Press. 
Saikko, P., Berg, J., & J arvisalo, M. (2016). LMHS: A SAT-IP hybrid Max SAT solver. In Creignou, N., & Berre, D. L. (Eds.), Theory and Applications of Satisfiability Testing - SAT 2016 - 19th International Conference, Bordeaux, France, July 5-8, 2016, Proceedings, Vol. 9710 of Lecture Notes in Computer Science, pp. 539 546. Springer. Saikko, P., Dodaro, C., Alviano, M., & J arvisalo, M. (2018). A hybrid approach to optimization in answer set programming. In Thielscher, M., Toni, F., & Wolter, F. (Eds.), Principles of Knowledge Representation and Reasoning: Proceedings of the Sixteenth International Conference, KR 2018, Tempe, Arizona, 30 October - 2 November 2018, pp. 32 41. AAAI Press. Silva, J. P. M., & Sakallah, K. A. (1999). GRASP: A search algorithm for propositional satisfiability. IEEE Transactions on Computers, 48(5), 506 521. Sinz, C. (2005). Towards an optimal CNF encoding of boolean cardinality constraints. In van Beek, P. (Ed.), Principles and Practice of Constraint Programming - CP 2005, 11th Ihalainen, Berg, & J arvisalo International Conference, CP 2005, Sitges, Spain, October 1-5, 2005, Proceedings, Vol. 3709 of Lecture Notes in Computer Science, pp. 827 831. Springer. Smirnov, P., Berg, J., & J arvisalo, M. (2021). Pseudo-boolean optimization by implicit hitting sets. In Michel, L. D. (Ed.), 27th International Conference on Principles and Practice of Constraint Programming, CP 2021, Montpellier, France (Virtual Conference), October 25-29, 2021, Vol. 210 of LIPIcs, pp. 51:1 51:20. Schloss Dagstuhl - Leibniz-Zentrum f ur Informatik. Smirnov, P., Berg, J., & J arvisalo, M. (2022). Improvements to the implicit hitting set approach to pseudo-boolean optimization. In Meel, K. S., & Strichman, O. (Eds.), 25th International Conference on Theory and Applications of Satisfiability Testing, SAT 2022, August 2-5, 2022, Haifa, Israel, Vol. 236 of LIPIcs, pp. 13:1 13:18. Schloss Dagstuhl - Leibniz-Zentrum f ur Informatik. Tseitin, G. (1983). 
On the complexity of derivation in propositional calculus. In Siekmann, J. H., & Wrightson, G. (Eds.), Automation of Reasoning: 2: Classical Papers on Computational Logic 1967 1970, pp. 466 483. Springer Berlin Heidelberg, Berlin, Heidelberg. Warners, J. P. (1998). A linear-time transformation of linear inequalities into conjunctive normal form. Information Processing Letters, 68(2), 63 69. Zhang, L., Madigan, C. F., Moskewicz, M. W., & Malik, S. (2001). Efficient conflict driven learning in boolean satisfiability solver. In Ernst, R. (Ed.), Proceedings of the 2001 IEEE/ACM International Conference on Computer-Aided Design, ICCAD 2001, San Jose, CA, USA, November 4-8, 2001, pp. 279 285. IEEE Computer Society.