These days I'm working on routing algorithms and other ways to make
      travel safer, cleaner, and congestion-free.
	
      
Formerly, my research focus was on the design, analysis, and experimental
      evaluation of algorithms for hard problems, in particular graph
      problems and discrete optimization problems, from various areas
      such as computational biology, VLSI design, and operations research. I am
      also interested in compiler optimization and in tuning
      performance-critical code such as video codecs.
        
| 2021 | 
| Journal articles | 
| 
  
    Alexander Kröller,
      Falk Hüffner,
      Łukasz Kosma,
      Katja Kröller, and
      Mattia Zeni: Driver expectations toward strategic routing.
 Transportation Research Record 2675(11):44–53, 2021.
 Abstract.
      Strategic Routing is a traffic intervention mechanism. To reduce
      traffic in a certain area, drivers are asked to take a
      pre-defined route diverting them from the area, even though this
      increases their travel time. This can be implemented with
      navigation apps or in-dash navigation from navigation service
      providers such as TomTom or Google. Triggered by a traffic
      authority, the driver receives information through the service provider’s
      infrastructure and user interface about a new route and
      potentially a reward.
      In this work, we investigate the dynamics between traffic
      authority, service provider, and end user by analyzing user
      expectations. Ultimately, service providers are competing for
      customers. They can, therefore, only implement strategic routing
      if it appeals to drivers rather than scaring them away. We
      report on the insights of a two-stage study with 457
      participants, exploring what kind of strategic routing
      interventions are appreciated by drivers.
      We find that drivers report a high interest in seeing diversion
      suggestions, even when they are not inclined to take
      them. However, they are unwilling to have their route adapted
      automatically. Important factors affecting drivers’ willingness to divert are
      the reason for the detour and the additional driving
      time. Incentives do increase the efficacy, but only marginally
      increase user appreciation, indicating that users may
      mistrust strategic routing that relies too strongly on
      incentives.
 | 
| 2020 | 
| Conference articles | 
| 
  
    Thomas Bläsius,
    Maximilian Böther,
    Philipp Fischbeck,
    Tobias Friedrich,
    Alina Gries,
    Falk Hüffner,
    Otto Kißig,
    Pascal Lenzner,
    Louise Molitor,
    Leon Schiller,
    Armin Wells, and
    Simon Wietheger: A strategic routing framework and algorithms for computing alternative paths.
 In Proceedings of the 20th Symposium on Algorithmic Approaches for Transportation Modelling, Optimization, and Systems
      (ATMOS ’20). September 2020.
 Volume 85 in OpenAccess Series in Informatics (OASIcs), pages 10:1–10:14, Schloss Dagstuhl–Leibniz-Zentrum für Informatik.
 Abstract.
      Traditional navigation services find the fastest route for a
      single driver. Though always using the fastest route seems
      desirable for every individual, selfish behavior can have
      undesirable effects such as higher energy consumption and
      avoidable congestion, even leading to higher overall and
      individual travel times. In contrast, strategic routing aims at
      optimizing the traffic for all agents regarding a global
      optimization goal. We introduce a framework to formalize
      real-world strategic routing scenarios as algorithmic problems
      and study one of them, which we call Single Alternative Path
      (SAP), in detail. There, we are given an original route between
      a single origin–destination pair. The goal is to suggest an
      alternative route to all agents that optimizes the overall
      travel time under the assumption that the agents distribute
      among both routes according to a psychological model, for which
      we introduce the concept of Pareto-conformity. We show that the
      SAP problem is NP-complete, even for such models. Nonetheless,
      assuming Pareto-conformity, we give multiple algorithms for
      different variants of SAP, using multi-criteria shortest path
      algorithms as subroutines. Moreover, we prove that several
      natural models are in fact Pareto-conform. The implementation of
      our algorithms serves as a proof of concept, showing that SAP
      can be solved in reasonable time even though the algorithms have
      exponential running time in the worst case.
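      To make the multi-criteria subroutine concrete, here is a minimal
      bicriteria label-setting sketch in Python (a generic Pareto-frontier
      variant of Dijkstra over (time, length) labels, assuming nonnegative
      arc weights; all names are illustrative, and this is not the
      implementation from the paper):

          import heapq

          def pareto_shortest_paths(graph, source):
              """For every vertex, compute the Pareto frontier of
              (time, length) labels of paths from source.
              graph[u] is a list of (v, time, length) arcs with
              nonnegative weights. A textbook sketch; practical codes
              add pruning and dominance-aware data structures."""
              vertices = set(graph) | {v for arcs in graph.values()
                                       for v, _, _ in arcs}
              frontier = {u: [] for u in vertices}  # nondominated labels
              queue = [(0, 0, source)]              # (time, length, vertex)
              while queue:
                  t, l, u = heapq.heappop(queue)
                  if any(t2 <= t and l2 <= l for t2, l2 in frontier[u]):
                      continue                      # dominated, discard
                  frontier[u] = [lab for lab in frontier[u]
                                 if not (t <= lab[0] and l <= lab[1])]
                  frontier[u].append((t, l))
                  for v, dt, dl in graph.get(u, []):
                      heapq.heappush(queue, (t + dt, l + dl, v))
              return frontier

      For instance, with two parallel arcs s→a of cost (1, 5) and (3, 1),
      both labels survive at a, since neither dominates the other.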
 | 
| 2019 | 
| Patents | 
| 
  
    Timur Muraleev,
    Falk Hüffner,
    Alexander Kröller,
    Neil Clavin,
    Steffen Gunther Wiesner,
    Massimo Guggino,
    Michael Marte, and
    Henning Hasemann: Methods and systems for generating parking routes.
 Patent WO/2019/180004, 2019.
 Abstract.
      A method of determining a parking route for a vehicle travelling
      on a road network within a geographic area is disclosed. A
      sub-network is determined comprising a subset of segments of an
      electronic map that are representative of roads within a
      predetermined walking time or distance of a destination
      location. Data indicative of a walking time or distance from the
      segment to the destination location is associated, at least with
      the segments of the sub-network representing roads having at
      least one associated parking space. A search algorithm having an
      associated cost function is used to explore the segments of the
      sub-network from an origin location to identify a plurality of
      candidate parking routes, wherein the cost for a given parking
      route is based on the probability of the vehicle successfully
      finding a parking space on the parking route and the expected
      cumulative travel and walking time or distance to the
      destination location should a parking space be found along the
      parking route.
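      As a toy illustration of the cost idea sketched in the abstract, the
      following Python function computes the expected travel-plus-walking
      cost of a candidate parking route from per-segment parking
      probabilities (the names and the failure penalty are hypothetical
      modeling choices; the patent's exact cost function and search
      algorithm are not reproduced here):

          def expected_route_cost(segments, fail_penalty=60.0):
              """Expected cost of a candidate parking route.
              segments is an ordered list of dicts with keys
                drive:  minutes to traverse the segment,
                p_park: probability of finding a space on the segment,
                walk:   minutes of walking from the segment to the destination.
              Each segment's outcome is weighted by the probability that
              the first free space is found there."""
              cost, driven, p_searching = 0.0, 0.0, 1.0
              for seg in segments:
                  driven += seg["drive"]
                  p_here = p_searching * seg["p_park"]
                  cost += p_here * (driven + seg["walk"])
                  p_searching *= 1.0 - seg["p_park"]
              # No space found on the whole route: charge a penalty.
              cost += p_searching * (driven + fail_penalty)
              return cost

      A search algorithm can then rank candidate routes by this expected cost.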
 | 
| 2018 | 
| Journal articles | 
| 
  
      Robert Bredereck,
      Jiehua Chen,
      Falk Hüffner, and
      Stefan Kratsch: Parameterized complexity of team formation in social networks.
 Theoretical Computer Science 717:26–36, 2018
      (original publication).
 Abstract.
      Given a task that requires some skills and a social network of
      individuals with different skills, the Team Formation problem
      asks to find a team of individuals that together can perform the
      task, while minimizing communication costs. Since the problem is
      NP-hard, we identify the source of intractability by analyzing
      its parameterized complexity with respect to parameters such as
      the total number of skills k, the team size l, the
      communication cost budget b, and the maximum vertex
      degree Δ. We show that the computational complexity
      strongly depends on the communication cost measure: when using
      the weight of a minimum spanning tree of the subgraph formed by
      the selected team, we obtain fixed-parameter tractability for
      example with respect to the parameter k. In contrast,
      when using the diameter as measure, the problem is intractable
      with respect to any single parameter; however,
      combining Δ with either b or l yields
      fixed-parameter tractability.
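      For intuition about the minimum-spanning-tree cost measure, here is a
      brute-force Python/networkx sketch (workable only for tiny graphs; the
      point of the paper is precisely that fixed-parameter algorithms
      replace such exhaustive search; skills and the weight attribute are
      illustrative):

          from itertools import combinations
          import networkx as nx

          def cheapest_team(G, skills, required, max_size):
              """Exhaustive Team Formation under the MST communication
              cost: find a connected team covering all required skills
              whose induced subgraph has a minimum-weight spanning tree."""
              best = None
              for size in range(1, max_size + 1):
                  for team in combinations(G.nodes, size):
                      covered = set().union(*(skills[v] for v in team))
                      if not required <= covered:
                          continue              # some skill is missing
                      sub = G.subgraph(team)
                      if not nx.is_connected(sub):
                          continue              # MST cost undefined
                      mst = nx.minimum_spanning_tree(sub, weight="weight")
                      cost = mst.size(weight="weight")
                      if best is None or cost < best[0]:
                          best = (cost, team)
              return best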
  
    Guillaume Fertin, 
      Falk Hüffner,
      Christian Komusiewicz, and
      Manuel Sorge: Matching algorithms for assigning orthologs after genome duplication events.
 Computational Biology and Chemistry 74:379–390, 2018
      (original publication).
 Abstract.
      In this paper, we introduce and analyze two graph-based models
      for assigning orthologs in the presence of whole-genome
      duplications, using similarity information between pairs of
      genes. The common feature of our two models is that genes of the
      first genome may be assigned two orthologs from the second
      genome, which has undergone a whole-genome
      duplication. Additionally, our models incorporate the new notion
      of duplication bonus, a parameter that reflects how assigning
      two orthologs to a given gene should be rewarded or
      penalized. Our work is mainly focused on developing exact and
      reasonably time-consuming algorithms for these two models: we
      show that the first one is polynomial-time solvable, while the
      second is NP-hard. For the latter, we thus design two
      fixed-parameter algorithms, i.e. exact algorithms whose running
      times are exponential only with respect to a small and
      well-chosen input parameter. Finally, for both models, we
      evaluate our algorithms on pairs of plant genomes. Our
      experiments show that the NP-hard model yields a better cluster
      quality at the cost of lower coverage, due to the fact that our
      instances cannot be completely solved by our
      algorithms. However, our results are altogether encouraging and
      show that our methods yield biologically significant predictions
      of orthologs when the duplication bonus value is properly
      chosen.
 | 
| 2017 | 
| Journal articles | 
| 
  
    René van Bevern, 
      Robert Bredereck,
      Morgan Chopin,
      Sepp Hartung,
      Falk Hüffner,
      André Nichterlein, and
      Ondřej Suchý: Fixed-parameter algorithms for DAG partitioning.
 Discrete Applied Mathematics 220:134–160, 2017
      (original publication).
 Abstract.
      Finding the origin of short phrases propagating through the web
      has been formalized by Leskovec et al. (2009)
      as DAG Partitioning: given an
      arc-weighted directed acyclic graph on n vertices and m arcs,
      delete arcs with total weight at most k such that each
      resulting weakly-connected component contains exactly one sink—a
      vertex without outgoing arcs. DAG Partitioning is NP-hard.
      We show an algorithm to solve DAG Partitioning in
      O(2^k · (n + m)) time,
      that is, in linear time for fixed k. We complement it
      with linear-time executable data reduction rules. Our
      experiments show that, in combination, they can optimally
      solve DAG Partitioning on simulated
      citation networks within five minutes for k ≤ 190
      and m being 10^7 and larger. We use our
      obtained optimal solutions to evaluate the solution quality of
      Leskovec et al.’s heuristic.  We show that Leskovec et al.’s
      heuristic works optimally on trees and generalize this result by
      showing that DAG Partitioning is
      solvable in 2^(O(t^2)) · n time if a width-t tree decomposition of the
      input graph is given. Thus, we improve an algorithm and answer
      an open question of Alamdari and Mehrabian (2012).  We
      complement our algorithms by lower bounds on the running time of
      exact algorithms and on the effectiveness of data reduction.
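      To make the problem statement concrete, here is a naive exact
      Python/networkx sketch for tiny unweighted instances (it simply tries
      all arc subsets of size at most k; the O(2^k · (n + m)) search tree
      and the data reduction rules from the paper are what make realistic
      instances feasible):

          from itertools import combinations
          import networkx as nx

          def dag_partitioning_naive(D, k):
              """Find up to k arcs whose deletion leaves every weakly
              connected component with exactly one sink, or None.
              Exhaustive, unweighted, and only for toy inputs."""
              def feasible(H):
                  for comp in nx.weakly_connected_components(H):
                      sinks = [v for v in comp if H.out_degree(v) == 0]
                      if len(sinks) != 1:
                          return False
                  return True
              for r in range(k + 1):
                  for deleted in combinations(list(D.edges), r):
                      H = D.copy()
                      H.remove_edges_from(deleted)
                      if feasible(H):
                          return deleted
              return None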
 | 
| 2016 | 
| Conference articles | 
| 
  
    Robert Bredereck,
      Jiehua Chen,
      Falk Hüffner, and
      Stefan Kratsch: Parameterized complexity of team formation in social networks.
 In Proceedings of the 11th International Conference on Algorithmic Aspects in Information and Management
      (AAIM ’16),
      Bergamo, Italy. July 2016.
 Volume 9778 in Lecture Notes in Computer Science, pages 137–149,
      Springer, 2016 (original publication).
    Journal version available.
 Abstract.
      Given a task that requires some skills and a social network of
      individuals with different skills, the Team
      Formation problem asks to find a team of individuals that
      together can perform the task, while minimizing communication
      costs. Since the problem is NP-hard, we identify the source of
      intractability by analyzing its parameterized complexity with
      respect to parameters such as the total number of
      skills k, the team size l, the communication cost
      budget b, and the maximum vertex degree Δ. We show
      that the computational complexity strongly depends on the
      communication cost measure: when using the weight of a minimum
      spanning tree of the subgraph formed by the selected team, we
      obtain fixed-parameter tractability for example with respect to
      the parameter k. In contrast, when using the diameter as
      measure, the problem is intractable with respect to any single
      parameter; however, combining Δ with either b
      or l yields fixed-parameter tractability.
 | 
| 2015 | 
| Conference articles | 
| 
  
    Falk Hüffner,
      Christian Komusiewicz, and
      André Nichterlein: Editing graphs into few cliques: complexity, approximation, and kernelization schemes.
 In Proceedings of the 14th Algorithms and Data Structures Symposium
      (WADS ’15),
      Victoria, Canada. August 2015.
 Volume 9214 in Lecture Notes in Computer Science, pages 410–421,
      Springer, 2015 (original publication).
 Abstract.
      Given an undirected graph G and a positive
      integer k, the Sparse Split Graph
      Editing problem asks to transform G into a graph
      that consists of a clique plus isolated vertices by performing
      at most k edge insertions and deletions; similarly,
      the P3-Bag Editing
      problem asks to transform G into a graph which is the
      union of two possibly overlapping cliques.
      Sparse Split Graph Editing is
      NP-complete; we give a simple linear-time 3-approximation
      algorithm, an improvement over a more involved known
      factor-3.525 approximation. Further, we show
      that P3-Bag Editing is
      also NP-complete.  Finally, we present a kernelization scheme
      for both problems and additionally for the
      2-Cluster Editing problem. This scheme
      produces for each fixed ϵ in polynomial time a kernel of
      size ϵk. This is, to the best of our knowledge, the first
      example of a kernelization scheme that converges to a lower
      bound.
  
    Falk Hüffner,
      Christian Komusiewicz, and
      Manuel Sorge: Finding highly connected subgraphs.
 In Proceedings of the 41st International Conference on Current Trends in Theory and Practice of Computer Science
      (SOFSEM ’15),
      Pec pod Sněžkou, Czech Republic. January 2015.
 Volume 8939 in Lecture Notes in Computer Science, pages 254–265,
      Springer, 2015 (original publication).
 Abstract.
      A popular way of formalizing clusters in networks are
      highly connected subgraphs, that is, subgraphs
      of k vertices that have edge connectivity larger
      than k/2 (equivalently, minimum degree larger
      than k/2). We examine the computational complexity of
      finding highly connected subgraphs. We first observe that this
      problem is NP-hard. Thus, we explore possible parameterizations,
      such as the solution size, number of vertices in the input, the
      size of a vertex cover in the input, and the number of edges
      outgoing from the solution (edge isolation), and expose their
      influence on the complexity of this problem. For some
      parameters, we find strong intractability results; among the
      parameters yielding tractability, the edge isolation seems to
      provide the best trade-off between running time bounds and a
      small parameter value in relevant instances.
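      The definition itself is easy to check; a small Python/networkx sketch
      using the degree characterization quoted above (a verifier for
      candidate subgraphs, not a search algorithm):

          import networkx as nx

          def is_highly_connected(G, nodes):
              """Check whether the subgraph induced by nodes is highly
              connected: on k vertices, edge connectivity > k/2,
              equivalently minimum degree > k/2."""
              sub = G.subgraph(nodes)
              k = sub.number_of_nodes()
              return k > 0 and min(d for _, d in sub.degree()) > k / 2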
 | 
| Journal articles | 
| 
  
    René van Bevern,
    Jiehua Chen,
    Falk Hüffner,
    Stefan Kratsch,
    Nimrod Talmon, and
    Gerhard J. Woeginger: Approximability and parameterized complexity of multicover by c-intervals.
 Information Processing Letters 115(10):744–749,
      2015
      (original publication).
 Abstract.
      A c-interval is the disjoint union of c intervals
      over ℕ. The c-Interval Multicover
      problem is the special case of Set
      Multicover where all sets available for covering
      are c-intervals. We strengthen known APX-hardness results
      for c-Interval Multicover, show
      W[1]-hardness when parameterized by the solution size, and
      present fixed-parameter algorithms for alternative
      parameterizations.
  
    Sharon Bruckner,
      Falk Hüffner, and
      Christian Komusiewicz: A graph modification approach for finding core–periphery structures in protein interaction networks.
 Algorithms for Molecular Biology 10:16,
      2015.
 Abstract.
      The core–periphery model for protein interaction (PPI) networks
      assumes that protein complexes in these networks consist of a
      dense core and a possibly sparse periphery that is adjacent to
      vertices in the core of the complex. In this work, we aim at
      uncovering a global core–periphery structure for a given PPI
      network. We propose two exact graph-theoretic formulations for
      this task, which aim to fit the input network to a hypothetical
      ground truth network by a minimum number of edge
      modifications. In one model each cluster has its own periphery,
      and in the other the periphery is shared. We first analyze both
      models from a theoretical point of view, showing their
      NP-hardness. Then, we devise efficient exact and heuristic
      algorithms for both models and finally perform an evaluation on
      subnetworks of the S. cerevisiae PPI network.
  
    Falk Hüffner,
      Christian Komusiewicz,
      Rolf Niedermeier, and
      Martin Rötzschke: The parameterized complexity of the rainbow subgraph problem.
 Algorithms 8(1):60–81, 2015
      (original publication).
 Abstract.
      The NP-hard Rainbow Subgraph problem,
      motivated from bioinformatics, is to find in an edge-colored
      graph a subgraph that contains each edge color exactly once and
      has at most k vertices. We examine the parameterized
      complexity of Rainbow Subgraph for
      paths, trees, and general graphs. We show
      that Rainbow Subgraph is W[1]-hard with
      respect to the parameter k and also with respect to the
      dual parameter l := n − k
      where n is the number of vertices. Hence, we examine
      parameter combinations and show, for example, a polynomial-size
      problem kernel for the combined parameter l and “maximum
      number of colors incident with any vertex”.
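      A solution checker is straightforward; a small Python/networkx sketch
      (storing edge colors in a dict keyed by frozenset is an illustrative
      choice):

          from collections import Counter
          import networkx as nx

          def is_rainbow_solution(G, edge_color, S):
              """Check the Rainbow Subgraph condition for a vertex set S:
              the induced subgraph must contain every edge color exactly
              once. edge_color[frozenset((u, v))] is the color of {u, v}."""
              seen = Counter(edge_color[frozenset(e)]
                             for e in G.subgraph(S).edges)
              all_colors = set(edge_color.values())
              return (set(seen) == all_colors
                      and all(c == 1 for c in seen.values()))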
 | 
| 2014 | 
| Conference articles | 
| 
  
    Sharon Bruckner,
      Falk Hüffner, and
      Christian Komusiewicz: A graph modification approach for finding core–periphery structures in protein interaction networks.
 In Proceedings of the 14th Workshop on Algorithms in Bioinformatics
      (WABI ’14),
      Wrocław, Poland. September 2014.
 Volume 8701 in Lecture Notes in Bioinformatics, pages 340–351,
      Springer, 2014 (original publication).
    Journal version available.
 Abstract.
      The core–periphery model for protein interaction (PPI) networks
      assumes that protein complexes in these networks consist of a
      dense core and a possibly sparse periphery that is adjacent to
      vertices in the core of the complex. In this work, we aim at
      uncovering a global core–periphery structure for a given PPI
      network. We propose two exact graph-theoretic formulations for
      this task, which aim to fit the input network to a hypothetical
      ground truth network by a minimum number of edge
      modifications. In one model each cluster has its own periphery,
      and in the other the periphery is shared.  We first analyze both
      models from a theoretical point of view, showing their
      NP-hardness. Then, we devise efficient exact and heuristic
      algorithms for both models and finally perform an evaluation on
      subnetworks of the S. cerevisiae PPI network.
  
    Falk Hüffner,
      Christian Komusiewicz,
      Rolf Niedermeier, and
      Martin Rötzschke: The parameterized complexity of the rainbow subgraph problem.
 In Proceedings of the 40th International Workshop on Graph-Theoretic Concepts in Computer Science
      (WG ’14),
      Le Domaine de Chalès, France. June 2014.
 Volume 8747 in Lecture Notes in Computer Science, pages 287–298,
      Springer, 2014 (original publication).
    Journal version available.
 Abstract.
      The Rainbow Subgraph problem, motivated
      from bioinformatics, is to find in an edge-colored graph a
      subgraph with a minimum number of vertices that contains all
      colors. We examine the parameterized complexity
      of Rainbow Subgraph for paths, trees,
      and general graphs. We show APX-hardness even on properly
      edge-colored paths where every color occurs at most twice, and
      that no single natural parameter yields tractability. Hence, we
      examine parameter combinations and obtain e.g. tractability and
      a kernel for the combination of “number of vertices not in the
      solution” and “maximum number of colors incident on a vertex”.
 | 
| Journal articles | 
| 
  
    Laurent Bulteau,
      Falk Hüffner,
      Christian Komusiewicz, and
      Rolf Niedermeier: Multivariate algorithmics for NP-hard string problems.
 Bulletin of the EATCS 114:31–73, 2014
      (original publication).
 Note: Compared to the published version, this version
    fixes typos in Figure 1 and in Challenge 13.
 Abstract.
      String problems arise in various applications ranging from text
      mining to biological sequence analysis. Although string problems
      are often considered to be easier than graph problems, many
      string problems are NP-hard as well. This motivates the search
      for (fixed-parameter) tractable special cases of these
      problems. We survey the state of the art concerning
      parameterized algorithmics for NP-hard string problems and
      identify challenges for future research.
  
    Falk Hüffner,
      Christian Komusiewicz,
      Adrian Liebtrau, and
      Rolf Niedermeier: Partitioning biological networks into highly connected clusters with maximum edge coverage.
 IEEE/ACM Transactions on Computational Biology and Bioinformatics 11(3):455–467, 2014
      (original publication).
 Abstract.
      A popular clustering algorithm for biological networks which was
      proposed by Hartuv and Shamir [IPL 2000] identifies
      nonoverlapping highly connected components. We extend the
      approach taken by this algorithm by introducing the
      combinatorial optimization problem Highly
      Connected Deletion, which asks for removing as few edges
      as possible from a graph such that the resulting graph consists
      of highly connected components. We show
      that Highly Connected Deletion is
      NP-hard and provide a fixed-parameter algorithm and a
      kernelization. We propose exact and heuristic solution
      strategies, based on polynomial-time data reduction rules and
      integer linear programming with column generation. The data
      reduction typically identifies 75% of the edges that are deleted
      for an optimal solution; the column generation method can then
      optimally solve protein interaction networks with up to
      6 000 vertices and 13 500 edges in less than a
      day. Additionally, we present a new heuristic that finds more
      clusters than the method by Hartuv and Shamir.
 | 
| 2013 | 
| Conference articles | 
| 
  
    René van Bevern, 
    Robert Bredereck,
    Morgan Chopin,
    Sepp Hartung,
    Falk Hüffner,
    André Nichterlein, and
    Ondřej Suchý: Parameterized complexity of DAG partitioning.
 In Proceedings of the 8th International Conference on Algorithms and Complexity
      (CIAC ’13),
      Barcelona, Spain. May 2013.
 Volume 7878 in Lecture Notes in Computer Science, pages 49–60,
      Springer, 2013 (original publication).
 Abstract.
      The goal of tracking the origin of short, distinctive phrases
      (memes) that propagate through the web in reaction to current events
      has been formalized as DAG Partitioning: given a
      directed acyclic graph, delete edges of minimum weight such that each
      resulting connected component of the underlying undirected graph contains
      only one sink.  Motivated by NP-hardness and hardness of approximation
      results, we consider the parameterized complexity of this problem. We show
      that it can be solved in O(2^k · n^2) time,
      where k is the number of edge deletions, proving fixed-parameter
      tractability for parameter k. We then show that unless the Exponential
      Time Hypothesis (ETH) fails, this cannot be improved to
      2^(o(k)) · n^(O(1));
      further, DAG Partitioning does not have a polynomial
      kernel unless NP ⊆ coNP/poly. Finally, given a tree decomposition of
      width w, we show how to solve DAG Partitioning
      in 2^(O(w^2)) · n time, improving a
      known algorithm for the parameter pathwidth.
  
    Sharon Bruckner,
      Falk Hüffner,
      Christian Komusiewicz, and
      Rolf Niedermeier: Evaluation of ILP-based approaches for partitioning into colorful components.
 In Proceedings of the 12th International Symposium on Experimental Algorithms
      (SEA ’13),
      Rome, Italy. June 2013.
 Volume 7933 in Lecture Notes in Computer Science, pages 176–187,
      Springer, 2013 (original publication).
 Abstract.
      The NP-hard Colorful Components problem
      is a graph partitioning problem on vertex-colored graphs. We
      identify a new application of Colorful
      Components in the correction of Wikipedia interlanguage
      links. We then describe and compare several exact and heuristic
      approaches for solving Colorful
      Components. In particular, we devise two ILP
      formulations, where one employs a relationship to the
      Hitting Set problem and the other one
      uses a Clique Partition formulation.
      Furthermore, we use the recently proposed implicit hitting set
      framework [Karp, JCSS 2011, Chandrasekaran et al., SODA 2011] to
      solve Colorful Components. Finally, we
      also study a move-based and a merge-based heuristic for
      Colorful Components. Our approach can
      solve the Colorful Components model of
      Wikipedia link correction optimally; while the
      Clique Partition-based ILP outperforms
      the other two approaches, the implicit hitting set is a simple
      approach that delivers competitive results. Finally, the
      merge-based heuristic is very accurate and outperforms the
      move-based one.
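      Independent of the solution method, feasibility is easy to verify; a
      minimal Python/networkx checker (names illustrative):

          import networkx as nx

          def colorful_components_ok(G, color, deleted_edges):
              """Check a Colorful Components solution: after removing
              deleted_edges, no connected component may contain two
              vertices of the same color. Returns (feasible, cost)."""
              H = G.copy()
              H.remove_edges_from(deleted_edges)
              for comp in nx.connected_components(H):
                  seen = set()
                  for v in comp:
                      if color[v] in seen:
                          return False, len(deleted_edges)
                      seen.add(color[v])
              return True, len(deleted_edges)

      Any of the ILP or heuristic solutions above can be validated this way.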
 
    Falk Hüffner,
      Christian Komusiewicz,
      Adrian Liebtrau, and
      Rolf Niedermeier: Partitioning biological networks into highly connected clusters with maximum edge coverage.
 In Proceedings of the 9th International Symposium on Bioinformatics Research and Applications
      (ISBRA ’13),
      Charlotte, North Carolina. May 2013.
 Volume 7875 in Lecture Notes in Computer Science, pages 99–111,
      Springer, 2013 (original publication).
    Journal version available.
 Abstract.
      We introduce the combinatorial optimization problem
      Highly Connected Deletion, which
      asks for removing as few edges as possible from a graph such
      that the resulting graph consists of highly connected
      components. We show that Highly Connected
      Deletion is NP-hard and provide a fixed-parameter
      algorithm and a kernelization. We propose exact and heuristic
      solution strategies, based on polynomial-time data reduction
      rules and integer linear programming with column generation. The
      data reduction typically identifies 85% of the edges
      that need to be deleted for an optimal solution; the column
      generation method can then optimally solve protein interaction
      networks with up to 5 000 vertices and
      12 000 edges.
 | 
| Journal articles | 
| 
  
    Hartmut Ehrig,
      Claudia Ermel,
      Falk Hüffner,
      Rolf Niedermeier, and 
      Olga Runge: Confluence in data reduction: bridging graph transformation and kernelization.
 Computability 2(1):31–49, 2013
      (original publication).    
 Abstract.
      Kernelization is a core tool of parameterized algorithmics for
      coping with computationally intractable problems. A
      kernelization reduces in polynomial time an input
      instance to an equivalent instance whose size is bounded by a
      function only depending on some problem-specific
      parameter k; this new instance is called problem
      kernel. Typically, problem kernels are achieved by performing
      efficient data reduction rules. So far, there was little
      systematic study in the literature concerning the mutual
      interaction of data reduction rules, in particular whether data
      reduction rules for a specific problem always lead to the same
      reduced instance, no matter in which order the rules are
      applied. This corresponds to the concept of confluence from the
      theory of rewriting systems. We argue that it is valuable to
      study whether a kernelization is confluent, using the NP-hard
      graph problems (Edge) Clique Cover
      and Partial Clique Cover as running
      examples.  We apply the concept of critical pair analysis from
      graph transformation theory, supported by the AGG software tool.
      These results support the main goal of our work, namely, to
      establish a fruitful link between (parameterized) algorithmics
      and graph transformation theory, two so far unrelated fields.
 | 
| 2012 | 
| Conference articles | 
| 
  
    Sharon Bruckner,
      Falk Hüffner,
      Christian Komusiewicz,
      Rolf Niedermeier,
      Sven Thiel, and
      Johannes Uhlmann: Partitioning into colorful components by minimum edge deletions.
 In Proceedings of the 23rd Annual Symposium on Combinatorial Pattern Matching
      (CPM ’12),
      Helsinki, Finland. July 2012.
 Volume 7354 in Lecture Notes in Computer Science, pages 56–69,
      Springer, 2012 (original publication).
 Abstract.
      The NP-hard Colorful Components problem is, given a
      vertex-colored graph, to delete a minimum number of edges such that no
      connected component contains two vertices of the same color.  It has
      applications in multiple sequence alignment and in multiple network alignment
      where the colors correspond to species. We initiate a systematic
      complexity-theoretic study of Colorful Components by
      presenting NP-hardness as well as fixed-parameter tractability results for
      different variants of Colorful Components.  We also
      perform experiments with our algorithms and additionally develop an efficient
      and very accurate heuristic algorithm clearly outperforming a previous
      min-cut-based heuristic on multiple sequence alignment data.
  
    Hartmut Ehrig,
      Claudia Ermel,
      Falk Hüffner,
      Rolf Niedermeier, and
      Olga Runge: Confluence in data reduction: bridging graph transformation and kernelization.
 In Proceedings of the 8th Conference on Computability in Europe
      (CiE ’12),
      Cambridge, UK. June 2012.
 Volume 7318 in Lecture Notes in Computer Science, pages 193–202,
      Springer, 2012 (original publication).
    Journal version available.
 Abstract.
      Kernelization is a core tool of parameterized algorithmics for coping with
      computationally intractable problems. A kernelization reduces in polynomial time an
      input instance to an equivalent instance whose size is bounded by a function only
      depending on some problem-specific parameter k; this new instance is called
      problem kernel. Typically, problem kernels are achieved by performing efficient data
      reduction rules. So far, there was little study in the literature concerning the mutual
      interaction of data reduction rules; in particular, it seems unstudied whether data
      reduction rules for a specific problem always lead to the same reduced instance, no
      matter in which order the rules are applied. This corresponds to the concept of
      confluence from rewriting system theory. We argue that it is interesting to study
      whether a kernelization is confluent.  To this end, we use the NP-hard graph
      problem (Edge) Clique Cover as a running example, which asks
      for the smallest set of cliques in a graph such that each edge is contained in at least
      one clique. We provide a linear-time confluent kernelization for this problem and
      demonstrate how to apply the concept of critical pair analysis from graph
      transformation theory to achieve this, supported by the AGG software tool. We
      generalize our investigations to the Partial Clique Cover
      problem, which requires a more involved analysis. These results support the main goal
      of our work, namely, to establish a link between (parameterized) algorithmics and graph
      transformation theory, two so far unrelated fields whose interaction promises to be
      beneficial for both sides.
 | 
| 2011 | 
| Conference articles | 
| 
  
    Antonios Antoniadis,
      Falk Hüffner,
      Pascal Lenzner,
      Carsten Moldenhauer, and
      Alexander Souza: Balanced interval coloring.
 In Proceedings of the 28th Symposium on Theoretical Aspects of Computer Science
      (STACS ’11),
      Dortmund, Germany. March 2011.
 Volume 9 in Leibniz International Proceedings in Informatics (LIPIcs), pages 531–542, Schloss Dagstuhl–Leibniz-Zentrum für Informatik
      (original publication).
 Abstract.
      We consider the discrepancy problem of coloring n
      intervals with k colors such that at each point on the
      line, the maximal difference between the number of intervals of
      any two colors is minimal. Somewhat surprisingly, a coloring
      with maximal difference at most one always exists. Furthermore,
      we give an algorithm with running time O(n log
      n + kn log k) for its construction. This is
      in particular interesting because many known results for
      discrepancy problems are non-constructive. This problem
      naturally models a load balancing scenario, where n tasks
      with given start and end times have to be distributed among
      k servers. Our results imply that this can be done
      ideally balanced. When generalizing to d-dimensional boxes (instead of
      intervals), a solution with difference at most one is not always
      possible. We show that it is NP-complete to decide this for any
      d ≥ 2 and any k ≥ 2, which implies also
      NP-hardness of the respective minimization problem.
 In an online scenario, where intervals arrive over time and the
      color has to be decided upon arrival, the maximal difference in
      the size of color classes can become arbitrarily high for any
      online algorithm.
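      Constructing a discrepancy-1 coloring requires the algorithm from the
      paper, but verifying a given coloring is a simple sweep; a Python
      sketch (closed-open intervals [s, e) assumed):

          def max_color_discrepancy(intervals, coloring, k):
              """Maximum, over all points on the line, of the difference
              between the largest and smallest color-class size.
              intervals[i] = (s, e); coloring[i] is in range(k).
              End events sort before start events at equal coordinates,
              matching the closed-open convention."""
              events = []
              for i, (s, e) in enumerate(intervals):
                  events.append((s, 1, coloring[i]))
                  events.append((e, -1, coloring[i]))
              events.sort()
              count = [0] * k
              worst = 0
              for _, delta, c in events:
                  count[c] += delta
                  worst = max(worst, max(count) - min(count))
              return worst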
    Britta Dorn, 
      Falk Hüffner,
      Dominikus Krüger,
      Rolf Niedermeier, and
      Johannes Uhlmann: Exploiting bounded signal flow for graph orientation based on cause–effect pairs.
 In Proceedings of the 1st International ICST Conference on Theory and Practice of Algorithms
		  in (Computer) Systems
      (TAPAS ’11),
      Rome, Italy. April 2011.
 Volume 6595 in Lecture Notes in Computer Science, pages 104–115, Springer
      (original publication).
    Journal version available.
 Abstract.
        We consider the following problem: Given an undirected network
        and a set of sender–receiver pairs, direct all edges
        such that the maximum number of “signal flows”
        defined by the pairs can be routed respecting edge
        directions. This problem has applications in communication
        networks and in understanding protein interaction based cell
        regulation mechanisms.  Since this problem is NP-hard,
        research so far concentrated on polynomial-time approximation
        algorithms and tractable special cases.  We take the viewpoint
        of parameterized algorithmics and examine several parameters
        related to the maximum signal flow over vertices or edges. We
        provide several fixed-parameter tractability results, and in
        one case a sharp complexity dichotomy between a linear-time
        solvable case and a slightly more general NP-hard case. We
        examine the value of these parameters for several real-world
        network instances.  For many relevant cases, the NP-hard
        problem can be solved to optimality. In this way,
        parameterized analysis yields both deeper insight into the
        computational complexity and practical solving strategies.
 | 
| Journal articles | 
| 
  
    Britta Dorn, 
      Falk Hüffner,
      Dominikus Krüger,
      Rolf Niedermeier, and
      Johannes Uhlmann: Exploiting bounded signal flow for graph orientation based on cause–effect pairs.
 Algorithms for Molecular Biology 6(1):21, 2011
      (original publication).
 Abstract.
      Background: We consider the following problem: Given an
      undirected network and a set of sender–receiver pairs, direct
      all edges such that the maximum number of “signal flows” defined
      by the pairs can be routed respecting edge directions. This
      problem has applications in understanding protein interaction
      based cell regulation mechanisms.  Since this problem is
      NP-hard, research so far concentrated on polynomial-time
      approximation algorithms and tractable special cases.
 Results: We take the viewpoint of parameterized
      algorithmics and examine several parameters related to the
      maximum signal flow over vertices or edges. We provide several
      fixed-parameter tractability results, and in one case a sharp
      complexity dichotomy between a linear-time solvable case and a
      slightly more general NP-hard case. We examine the value of
      these parameters for several real-world network instances.
 Conclusions: Several biologically relevant special cases
      of the NP-hard problem can be solved to optimality. In this way,
      parameterized analysis yields both deeper insight into the
      computational complexity and practical solving strategies.
 | 
| 2010 | 
| Journal articles | 
| 
  
    Sharon Bruckner,
      Falk Hüffner,
      Richard M. Karp,
      Ron Shamir, and
      Roded Sharan: Topology-free querying of protein interaction networks.
 Journal of Computational Biology 17(3):237–252, 2010
      (original publication).
 Abstract.
      In the network querying problem, one is given a protein complex
      or pathway of species A and a protein–protein
      interaction network of species B; the goal is to
      identify subnetworks of B that are similar to the
      query in terms of sequence, topology, or both. Existing
      approaches mostly depend on knowledge of the interaction
      topology of the query in the network of species A;
      however, in practice, this topology is often not known. To
      address this problem, we develop a topology-free querying
      algorithm, which we call Torque. Given a
      query, represented as a set of proteins, Torque seeks a matching set of proteins that
      are sequence-similar to the query proteins and span a connected
      region of the network, while allowing both insertions and
      deletions. The algorithm uses either dynamic programming
      or integer linear programming for the search task. We test
      Torque with queries from yeast, fly, and
      human, where we compare it to the QNet topology-based approach,
      and with queries from less studied species, where only
      topology-free algorithms apply. Torque
      detects many more matches than QNet, while giving results that
      are highly functionally coherent.
    
    Michael Dom,
      Jiong Guo,
      Falk Hüffner,
      Rolf Niedermeier, and
      Anke Truss: Fixed-parameter tractability results for feedback set problems in tournaments.
 Journal of Discrete Algorithms 8(1):76–86, 2010
      (original publication).
 Abstract.
      Complementing recent progress on classical complexity and
      polynomial-time approximability of feedback set problems in
      (bipartite) tournaments, we extend and improve fixed-parameter
      tractability results for these problems.  We show that Feedback Vertex Set in tournaments (FVST) is
      amenable to the novel iterative compression technique, and we
      provide a depth-bounded search tree for Feedback Arc Set in bipartite tournaments
      based on a new forbidden subgraph characterization.  Moreover,
      we apply the iterative compression technique to d-Hitting Set, which generalizes Feedback Vertex Set in tournaments, and obtain
      improved upper bounds for the time needed to solve 4-Hitting Set and 5-Hitting
      Set.  Using our parameterized algorithm for Feedback Vertex Set in tournaments, we also
      give an exact (not parameterized) algorithm for it running in
      O(1.709^n) time, where n is the number
      of input graph vertices, answering a question of Woeginger
      [Discrete Appl. Math. 156(3):397–405, 2008].
  
    Falk Hüffner,
      Christian Komusiewicz, 
      Hannes Moser, and
      Rolf Niedermeier: Fixed-parameter algorithms for cluster vertex deletion.
 Theory of Computing Systems 47(1):196–217, 2010
      (original publication).
 Abstract.
      We initiate the first systematic study of the NP-hard Cluster Vertex Deletion (CVD) problem (unweighted and
      weighted) in terms of fixed-parameter algorithmics. In the unweighted
      case, one searches for a minimum number of vertex deletions to transform a
      graph into a collection of disjoint cliques. The parameter is the number
      of vertex deletions.  We present efficient fixed-parameter algorithms for
      CVD applying the fairly new iterative compression technique. Moreover, we
      study the variant of CVD where the maximum number of cliques to be
      generated is prespecified. Here, we exploit connections to fixed-parameter
      algorithms for (weighted) Vertex Cover.
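      For contrast with iterative compression, the textbook baseline is the
      trivial 3^k search tree, sketched here in Python/networkx (it branches
      on an induced P3, since a graph is a cluster graph iff it contains no
      induced P3; this is not the faster algorithm from the paper):

          import networkx as nx

          def find_p3(G):
              """Return an induced path u-v-w (edge uw missing), or None."""
              for v in G:
                  nbrs = list(G[v])
                  for i, u in enumerate(nbrs):
                      for w in nbrs[i + 1:]:
                          if not G.has_edge(u, w):
                              return (u, v, w)
              return None

          def cluster_vertex_deletion(G, k):
              """Delete at most k vertices to obtain disjoint cliques;
              returns a deletion set or None. Trivial O(3^k) branching."""
              p3 = find_p3(G)
              if p3 is None:
                  return set()          # already a cluster graph
              if k == 0:
                  return None           # budget exhausted
              for v in p3:
                  H = G.copy()
                  H.remove_node(v)
                  rest = cluster_vertex_deletion(H, k - 1)
                  if rest is not None:
                      return {v} | rest
              return None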
  
    Falk Hüffner,
      Nadja Betzler, and
      Rolf Niedermeier: Separator-based data reduction for signed graph balancing.
 Journal of Combinatorial Optimization 20(4):335–360, 2010
      (original publication).
 Abstract.
      Polynomial-time data reduction is a classical approach to hard
      graph problems. Typically, particular small subgraphs are
      replaced by smaller gadgets. We generalize this approach to
      handle any small subgraph that has a small separator connecting
      it to the rest of the graph. The problem we study is the NP-hard
      Balanced Subgraph problem, which asks
      for a 2-coloring of a graph that minimizes the inconsistencies
      with given edge labels. It has applications in social networks,
      systems biology, and integrated circuit design. The data
      reduction scheme unifies and generalizes a number of previously
      known data reductions, and can be applied to a large number of
      graph problems where a coloring or a subset of the vertices is
      sought. To solve the instances that remain after reduction, we
      use a fixed-parameter algorithm based on iterative compression
      with a very effective heuristic speedup. Our implementation can
      solve biological real-world instances exactly for which
      previously only approximations were known. In addition, we
      present experimental results for financial networks and random
      networks.
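      The objective is compact to state in code; a tiny Python sketch of the
      inconsistency count that Balanced Subgraph minimizes (the edge labels
      '=' and '!=' are an illustrative encoding):

          def inconsistencies(edges, side):
              """Count violated edge labels for a 2-coloring side (0/1):
              an '=' edge wants equal colors, a '!=' edge different ones."""
              bad = 0
              for u, v, label in edges:
                  equal = side[u] == side[v]
                  if (label == '=') != equal:
                      bad += 1
              return bad

      The data reduction and iterative compression described above search
      for a 2-coloring minimizing this count.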
     | 
| 2009 | 
| Conference articles | 
| 
  
    Sebastian Böcker,
      Falk Hüffner,
      Anke Truss, and
      Magnus Wahlström: A faster fixed-parameter approach to drawing binary tanglegrams.
 In Proceedings of the 4th International Workshop on Parameterized and Exact Computation
      (IWPEC ’09),
      Copenhagen, Denmark. September 2009.
 Volume 5917 in Lecture Notes in Computer Science, pages 38–49, Springer
      (original publication).
 Abstract.
      Given two binary phylogenetic trees covering the same n
      species, it is useful to compare them by drawing them with
      leaves arranged side-by-side. To facilitate comparison, we would
      like to arrange the trees to minimize the number of crossings
      k induced by connecting pairs of identical species. This
      is the NP-hard Tanglegram Layout
      problem.  By providing a fast transformation to the Balanced Subgraph problem, we show that the
      problem admits an O(2^k · n^4) algorithm, improving upon a
      previous fixed-parameter approach with running time
      O(c^k · n^(O(1))) where c ≈ 1000. We
      enhance a Balanced Subgraph
      implementation based on data reduction and iterative compression
      with improvements tailored towards these instances, and run
      experiments with real-world data to show the practical
      applicability of this approach. All practically relevant
      (k ≤ 1000) Tanglegram Layout
      instances can be solved exactly within seconds. Additionally, we
      provide a kernel-like bound by showing how to reduce the Balanced Subgraph instances for Tanglegram Layout on complete binary trees to
      a size of O(k log k).
    
    Sharon Bruckner,
      Falk Hüffner,
      Richard M. Karp,
      Ron Shamir, and
      Roded Sharan: Topology-free querying of protein interaction networks.
 In Proceedings of the 13th Annual International Conference on Research in Computational Molecular Biology
      (RECOMB ’09),
      Tucson, Arizona, USA. May 2009.
 Volume 5541 in Lecture Notes in Bioinformatics, pages 74–89, Springer
      (original publication).
    Journal version available.
 Abstract.
      In the network querying problem, one is given a protein complex
      or pathway of species A and a protein-protein
      interaction network of species B; the goal is to
      identify subnetworks of B that are similar to the
      query. Existing approaches mostly depend on knowledge of the
      interaction topology of the query in the network of
      species A; however, in practice, this topology is
      often not known.  To combat this problem, we develop a
      topology-free querying algorithm, which we call Torque. Given a query, represented as a set of
      proteins, Torque seeks a matching set of
      proteins that are sequence-similar to the query proteins and
      span a connected region of the network, while allowing both
      insertions and deletions. The algorithm uses either
      dynamic programming or integer linear programming for the
      search task. We test Torque with queries
      from yeast, fly, and human, where we compare it to the QNet
      topology-based approach, and with queries from less studied
      species, where only topology-free algorithms apply. Torque detects many more matches than QNet,
      while in both cases giving results that are highly functionally
      coherent.
     | 
| Journal articles | 
| 
  
    Sharon Bruckner,
      Falk Hüffner,
      Richard M. Karp,
      Ron Shamir, and
      Roded Sharan: Torque: Topology-free querying of protein interaction networks.
 Nucleic Acids Research 37(Web Server issue):W106–W108, 2009
      (original publication).
 Abstract.
      Torque is a tool for cross-species
      querying of protein–protein interaction networks. It aims to
      answer the following question: given a set of proteins
      constituting a known complex or a pathway in one species, can a
      similar complex or pathway be found in the protein network of
      another species? To this end, Torque
      seeks a matching set of proteins that are sequence-similar to
      the query proteins and span a connected region of the target
      network, while allowing for both insertions and
      deletions. Unlike existing approaches, Torque does not require knowledge of the
      interconnections among the query proteins.  It can handle large
      queries of up to 25 proteins.
    
    Falk Hüffner: Algorithm engineering for optimal graph bipartization.
 Journal of Graph Algorithms and Applications, 13(2):77–98, 2009
      (original publication).
 Abstract.
      We examine exact algorithms for the NP-hard Graph
      Bipartization problem. The task is, given a graph, to find a
      minimum set of vertices to delete to make it bipartite. Based on the
      “iterative compression” method introduced by Reed, Smith, and
      Vetta in 2004, we present new algorithms and experimental results.
      The worst-case time complexity is improved. Based on new structural
      insights, we give a simplified and more intuitive correctness proof. This
      also allows to establish a heuristic improvement that in particular speeds
      up the search on dense graphs.  Our best algorithm can solve all instances
      from a testbed from computational biology within minutes, whereas
      established methods are only able to solve about half of the instances
      within reasonable time.
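      As a baseline for comparison, the problem itself fits in a few lines
      of Python/networkx (exhaustive search over deletion sets, usable only
      on tiny graphs; the iterative-compression algorithm from the paper is
      what makes the biological instances solvable):

          from itertools import combinations
          import networkx as nx

          def graph_bipartization_naive(G, k_max):
              """Smallest set of at most k_max vertices whose deletion
              makes G bipartite, or None."""
              nodes = set(G.nodes)
              for r in range(k_max + 1):
                  for S in combinations(nodes, r):
                      if nx.is_bipartite(G.subgraph(nodes - set(S))):
                          return set(S)
              return None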
  
    Falk Hüffner: Parametrisierte Ansätze für schwere Graphprobleme: Algorithmen und Experimente [Parameterized approaches to hard graph problems: algorithms and experiments].
 it – Information Technology, 51(3):171–174, 2009
      (original publication).
    Falk Hüffner,
      Christian Komusiewicz, 
      Hannes Moser, and
       Rolf Niedermeier: Isolation concepts for clique enumeration: comparison and computational experiments.
 Theoretical Computer Science, 410(52):5384–5397, 2009
      (original publication).
 Abstract.
      We do computational studies concerning the enumeration of
      isolated cliques in graphs. Isolation, as recently introduced,
      measures the degree of connectedness of the cliques to the rest
      of the graph. Isolation helps both in getting faster algorithms
      than for the enumeration of maximal general cliques and in
      filtering out cliques with special semantics. We compare three
      isolation concepts and their combination with two enumeration
      modi for maximal cliques (“isolated maximal” vs
      “maximal isolated”). All studied concepts exhibit
      the fixed-parameter tractability of the enumeration task with
      respect to the parameter “degree of isolation”.  We
      provide a first systematic experimental study of the
      corresponding enumeration algorithms, using synthetic graphs (in
      the G(n, m, p) model),
      financial networks, and a music artist similarity network
      (proposing the enumeration of isolated cliques as a useful
      instrument in analyzing financial and social networks).
   
      Christian Komusiewicz,       
        Falk Hüffner,
        Hannes Moser, and
         Rolf Niedermeier: Isolation concepts for efficiently enumerating dense subgraphs.
 Theoretical Computer Science, 410(38–40):3640–3654, 2009
     (original publication).
 Abstract.
      In an undirected graph G = (V, E), a set
      of k vertices is called c-isolated if it has less
      than c · k outgoing edges.  Ito and Iwama [ACM
      Trans. Algorithms, to appear] gave an algorithm to enumerate all
      c-isolated maximal cliques in
      O(4^c · c^4 · |E|) time. We extend this to enumerating all maximal
      c-isolated cliques (which are a superset) and improve the
      running time bound to O(2.89^c · c^2 · |E|), using modifications which also facilitate
      parallelizing the enumeration.  Moreover, we introduce a more
      restricted and a more general isolation concept and show that
      both lead to faster enumeration algorithms.  Finally, we extend
      our considerations to s-plexes (a relaxation of the
      clique notion), providing a W[1]-hardness result when the size
      of the s-plex is the parameter and a fixed-parameter
      algorithm for enumerating isolated s-plexes when the
      parameter describes the degree of isolation.
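      The isolation condition is a one-liner to check; a small Python sketch
      of the definition from the abstract:

          def is_c_isolated(G, S, c):
              """A vertex set S of size k is c-isolated if it has fewer
              than c * k outgoing edges, i.e., edges with exactly one
              endpoint in S. G[u] iterates over the neighbors of u."""
              S = set(S)
              outgoing = sum(1 for u in S for v in G[u] if v not in S)
              return outgoing < c * len(S)

      The enumeration algorithms derive their speed from exploiting this
      bound, not from checking it naively.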
 | 
| Book chapters | 
| 
  
    Falk Hüffner,
      Rolf Niedermeier, and
      Sebastian Wernicke: Fixed-parameter algorithms for graph-modeled data clustering.
 In: 
      Clustering Challenges in Biological Networks, chapter 1, pages 3–28,
      World Scientific, 2009
      (original publication).
 Abstract.
      Fixed-parameter algorithms can efficiently find optimal
      solutions to some NP-hard problems, including several problems
      that arise in graph-modeled data clustering. This survey
      provides a primer about practical techniques to develop such
      algorithms; in particular, we discuss the design of
      kernelizations (data reductions with provable
      performance guarantees) and depth-bounded search
      trees. Our investigations are circumstantiated by three
      concrete problems from the realm of graph-modeled data
      clustering for which fixed-parameter algorithms have been
      implemented and experimentally evaluated, namely Clique, Cluster
      Editing, and Clique Cover.
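      As a flavor of the depth-bounded search tree technique surveyed in the
      chapter, here is the classic O(3^k) branching for Cluster Editing in
      Python/networkx (branch on a conflict triple u-v-w with edges uv, vw
      but not uw; a didactic sketch, with refined branchings being much
      faster):

          import networkx as nx

          def conflict_triple(G):
              """Return (u, v, w) with edges uv, vw but no edge uw, or None."""
              for v in G:
                  nbrs = list(G[v])
                  for i, u in enumerate(nbrs):
                      for w in nbrs[i + 1:]:
                          if not G.has_edge(u, w):
                              return (u, v, w)
              return None

          def cluster_editing(G, k):
              """Edit at most k edges to reach a disjoint union of cliques;
              returns the list of edits to apply in order, or None.
              (Unoptimized: a pair may be touched twice along a branch.)"""
              triple = conflict_triple(G)
              if triple is None:
                  return []
              if k == 0:
                  return None
              u, v, w = triple
              for pair, insert in (((u, v), False), ((v, w), False),
                                   ((u, w), True)):
                  H = G.copy()
                  (H.add_edge if insert else H.remove_edge)(*pair)
                  rest = cluster_editing(H, k - 1)
                  if rest is not None:
                      return [("add" if insert else "del", pair)] + rest
              return None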
 | 
| 2008 | 
| Conference articles | 
| 
  
    Jiong Guo,
      Falk Hüffner,
      Christian Komusiewicz, and
      Yong Zhang: Improved algorithms for bicluster editing.
 In Proceedings of the 5th Annual Conference on Theory and Applications of Models
	  of Computation
      (TAMC ’08), 
      Xi'an, China. April 2008.
 Volume 4978 in Lecture Notes in Computer Science, pages 445–456, Springer
      (original publication).
 Note: After publication, we have been notified of a
      problem in the proof of Theorem 4 (the factor-4
      approximation). Thus, we retract the results in Section 4. A
      correct factor-4 approximation algorithm was given by Nir Ailon,
      Noa Avigdor-Elgrabli, Edo Liberty, and Anke van Zuylen:
      Improved Approximation Algorithms for Bipartite Correlation Clustering.
 Abstract.
      The NP-hard Bicluster Editing is to add
      or remove at most k edges to make a bipartite graph a
      vertex-disjoint union of complete bipartite subgraphs. It has
      applications in the analysis of gene expression data. We show
      that by polynomial-time preprocessing, one can shrink a problem
      instance to one with 4k vertices, thus proving that the
      problem has a linear kernel, improving a quadratic kernel
      result. We further give a search tree algorithm that improves
      the running time bound from the trivial
      O(4^k + m) to
      O(3.24^k + m). Finally, we give a
      randomized 4-approximation, improving a known approximation with
      factor 11.
  
    Falk Hüffner,
      Christian Komusiewicz, 
      Hannes Moser, and
      Rolf Niedermeier: Enumerating isolated cliques in synthetic and financial networks.
 In Proceedings of the 2nd Annual International Conference on Combinatorial Optimization and Applications
      (COCOA ’08), 
      St. John’s, Newfoundland, Canada. August 2008.
 Volume 5165 in Lecture Notes in Computer Science, pages 405–416, Springer
      (original publication).
    Journal version available.
 Abstract.
      We do computational studies concerning the enumeration of
      maximal isolated cliques in graphs. Isolation, as recently
      introduced, measures the degree of connectedness of the cliques
      to the rest of the graph. Isolation helps both in getting faster
      algorithms than for the enumeration of maximal general cliques
      and in filtering out cliques with special semantics. Our
      experiments cover synthetic graphs
      (G(n, m, p) model) and
      financial networks; our results propose the enumeration of
      isolated cliques as a useful instrument in analyzing financial
      networks.
  
    Falk Hüffner,
      Christian Komusiewicz, 
      Hannes Moser, and
      Rolf Niedermeier: Fixed-parameter algorithms for cluster vertex deletion.
 In Proceedings of the 8th Latin American Theoretical Informatics Symposium
      (LATIN ’08), 
      Búzios, Brazil. April 2008.
 Volume 4957 in Lecture Notes in Computer Science, pages 711–722, Springer
      (original publication).
    Journal version available.
 Abstract.
      We initiate the first systematic study of the NP-hard  Cluster Vertex Deletion (CVD) problem
      (unweighted and weighted) in terms of fixed-parameter
      algorithmics. Here one searches for a minimum number of vertex
      deletions to transform a graph into a collection of cliques. The
      parameter denotes the number of vertex deletions. We present
      several efficient fixed-parameter algorithms for CVD. Our
      iterative compression algorithm for CVD seems to be the first
      nontrivial application of this fairly new technique to a problem
      that is not a feedback set problem. Moreover, we study the
      variant of CVD where the number of cliques to be generated is
      prespecified. Here, we detect surprising connections to
      fixed-parameter algorithms for (Weighted)
      Vertex Cover.
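      For readers unfamiliar with iterative compression, the technique's generic outer loop is simple to state. A minimal sketch (hypothetical names; all problem-specific work is delegated to a `compress` routine, which for CVD is where the paper's contribution lies):

```python
def iterative_compression(vertices, compress, k):
    """Generic iterative-compression loop for a vertex-deletion problem.
    Vertices are added one by one; X stays a valid solution for the
    processed prefix, because deleting the new vertex itself repairs it.
    `compress(prefix, X)` must return a solution of size <= k for the
    graph induced on `prefix`, or None if none exists."""
    X, prefix = [], []
    for v in vertices:
        prefix.append(v)
        X = X + [v]                      # still a solution, size <= k + 1
        if len(X) > k:
            smaller = compress(prefix, X)
            if smaller is None:
                return None              # no size-k solution exists at all
            X = list(smaller)
    return X
```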
  
    Oriana Ponta,
      Falk Hüffner, and
      Rolf Niedermeier: Speeding up dynamic programming for some NP-hard graph recoloring problems.
 In Proceedings of the 5th Annual Conference on Theory and Applications of Models
	  of Computation
      (TAMC ’08), 
      Xi’an, China. April 2008.
 Volume 4978 in Lecture Notes in Computer Science, pages 490–501, Springer
      (original publication).
 Abstract.
  A vertex coloring of a tree is called convex if each color
  induces a connected component. The Convex
  Recoloring problem asks for a minimum-weight change of colors
  to achieve a convex coloring. For the non-uniformly weighted model,
  where the cost of changing a vertex v to color c
  depends on both v and c, we improve the running time
  on trees from O(Δ^κ · κn) to O(3^κ · κn), where Δ is the maximum
  vertex degree of the input graph G, κ is the number of colors,
  and n is the number of vertices in G. In the uniformly
  weighted case, where costs depend only on the vertex to be
  recolored, one can instead parameterize on the number of
  bad colors β ≤ κ, which is the
  number of colors that do not already induce a connected
  component. Here, we improve the running time from
  O(Δ^β · βn) to O(3^β · βn). The
  improvements come from a more fine-grained dynamic programming. For
  the case where the weights are integers bounded by M, using
  fast subset convolution, we further improve the running time with
  respect to the exponential part to
  O(2^κ · κ^4 n^2 M log^2(κnM))
  and O(2^β · β^4 n^2 M log^2(βnM)) for the non-uniformly weighted and
  uniformly weighted model, respectively. Finally, we use fast subset
  convolution to improve the exponential part of the running time of
  the related 1-Connected Coloring Completion
  problem from O(8^k · k + 2^k · kn) to
  O(4^k · k^4 log^2 k + 2^k · kn).
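  Since everything above hinges on the definition of convexity, a minimal feasibility check may help (a sketch with hypothetical names; `tree_adj` maps each vertex to its neighbors, `color` maps vertices to colors):

```python
from collections import defaultdict, deque

def is_convex(tree_adj, color):
    """Check convexity: every color class must induce a connected subtree."""
    classes = defaultdict(set)
    for v, c in color.items():
        classes[c].add(v)
    for verts in classes.values():
        start = next(iter(verts))
        seen, queue = {start}, deque([start])
        while queue:                     # BFS restricted to one color class
            v = queue.popleft()
            for u in tree_adj[v]:
                if u in verts and u not in seen:
                    seen.add(u)
                    queue.append(u)
        if seen != verts:                # this color class is disconnected
            return False
    return True
```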
 | 
| Journal articles | 
| 
  
    Michael Dom,
      Jiong Guo,
      Falk Hüffner, and
      Rolf Niedermeier: Closest 4-leaf power is fixed-parameter tractable.
 Discrete Applied Mathematics, 156(18):3345–3361, 2008
      (original publication).
 Abstract.
      The NP-complete Closest 4-Leaf Power
      problem asks, given an undirected graph, whether it can be
      modified by at most r edge insertions or deletions such
      that it becomes a 4-leaf power. Herein, a 4-leaf power is a
      graph that can be constructed by considering an unrooted
      tree—the 4-leaf root—with leaves one-to-one labeled
      by the graph vertices, where we connect two graph vertices by an
      edge iff their corresponding leaves are at distance at most 4 in
      the tree. Complementing previous work on Closest 2-Leaf Power and Closest 3-Leaf Power, we give the first
      algorithmic result for Closest 4-Leaf
      Power, showing that it is fixed-parameter tractable with respect to the
      parameter r.
  
    Jens Gramm,
      Jiong Guo,
      Falk Hüffner, and
      Rolf Niedermeier: Data reduction and exact algorithms for clique cover.
 ACM Journal of Experimental Algorithmics, 13, Article 2.2, 15 pages, 2008
      (original publication).
 Abstract.
      To cover the edges of a graph with a minimum number of cliques
      is an NP-hard problem with many applications.  We develop for
      this problem efficient and effective polynomial-time data
      reduction rules that, combined with a search tree algorithm,
      allow for exact problem solutions in competitive time. This is
      confirmed by experiments with real-world and synthetic
      data. Moreover, we prove the fixed-parameter tractability of
      covering edges by cliques.
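      The core of an exact search tree for this problem, stripped of the paper's data reduction rules, is compact: a cover with at most k cliques exists iff every edge can be assigned to one of k clique "slots" whose vertex sets stay cliques. A minimal sketch of that formulation (hypothetical names, not the paper's implementation):

```python
def clique_cover(adj, edges, k):
    """Can the edges be covered by at most k cliques? Search tree that
    assigns each edge to one of at most k clique 'slots'."""
    cliques = []                         # vertex sets of the slots opened so far

    def fits(C, u, v):
        # u and v may join slot C only if both are adjacent to all of C.
        return all(w in adj[u] and w in adj[v] for w in C if w not in (u, v))

    def branch(i):
        if i == len(edges):
            return True
        u, v = edges[i]
        if any({u, v} <= C for C in cliques):
            return branch(i + 1)         # edge already covered
        for C in cliques:                # try every existing slot ...
            if fits(C, u, v):
                added = {u, v} - C
                C |= added
                if branch(i + 1):
                    return True
                C -= added
        if len(cliques) < k:             # ... or open a new one
            cliques.append({u, v})
            if branch(i + 1):
                return True
            cliques.pop()
        return False

    return branch(0)
```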
    
    Jiong Guo,
      Falk Hüffner,
      Erhan Kenar,
      Rolf Niedermeier, and
      Johannes Uhlmann: Complexity and exact algorithms for vertex multicut in interval
      and bounded treewidth graphs.
 European Journal of Operational Research,
    186(2):542–553, 2008
      (original publication).
 Abstract.
    Multicut is a fundamental network
    communication and connectivity problem.  It is defined as: given
    an undirected graph and a collection of pairs of terminal
    vertices, find a minimum set of edges or vertices whose removal
    disconnects each pair. We mainly focus on the case of removing
    vertices, where we distinguish between allowing or disallowing the
    removal of terminal vertices. Complementing and refining previous
    results from the literature, we provide several NP-completeness
    and (fixed-parameter) tractability results for restricted classes
    of graphs such as trees, interval graphs, and graphs of bounded
    treewidth.
  
    Falk Hüffner,
      Rolf Niedermeier, and
      Sebastian Wernicke: Techniques for Practical Fixed-Parameter Algorithms.
 The Computer Journal,
      51(1):7–25, 2008
      (original publication).
 Abstract.  
      The fixed-parameter approach is an algorithm design technique
      for solving combinatorially hard (mostly NP-hard) problems. For
      some of these problems, it can lead to algorithms that are both
      efficient and guaranteed to find optimal
      solutions. Focusing on their application to solving NP-hard
      problems in practice, we survey three main techniques to develop
      fixed-parameter algorithms, namely kernelisation (data reduction
      with provable performance guarantee), depth-bounded search
      trees, and a new technique called iterative compression. Our
      discussion is illustrated by several concrete case studies
      and provides pointers to various current challenges in the
      field.
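      As a concrete taste of the first technique, here is the classic Buss kernelization for Vertex Cover, a textbook example of data reduction with a provable size guarantee (a sketch with hypothetical names, not code from the survey):

```python
def vc_kernel(adj, k):
    """Buss kernelization for Vertex Cover. A vertex of degree > budget
    must be in every cover within budget; afterwards a yes-instance has
    at most budget^2 edges. Returns (kernel, forced, budget) or None."""
    adj = {v: set(nb) for v, nb in adj.items()}
    forced = set()
    budget = k
    changed = True
    while changed:
        changed = False
        for v in list(adj):
            if len(adj[v]) > budget:     # v is in every cover of size <= budget
                forced.add(v)
                budget -= 1
                for u in adj[v]:
                    adj[u].discard(v)
                del adj[v]
                changed = True
    if budget < 0:
        return None
    m = sum(len(nb) for nb in adj.values()) // 2
    if m > budget * budget:              # too many edges: provably a no-instance
        return None
    kernel = {v: nb for v, nb in adj.items() if nb}   # drop isolated vertices
    return kernel, forced, budget
```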
  
    Falk Hüffner,
      Sebastian Wernicke, and
      Thomas Zichner: Algorithm engineering for color-coding with applications to signaling pathway detection.
 Algorithmica,
      52(2):114–132, 2008
      (original publication).
 Abstract.
      Color-coding is a technique to design fixed-parameter algorithms for several
      NP-complete subgraph isomorphism problems. Somewhat surprisingly, not much effort has
      so far been spent on the actual implementation of algorithms that are based on
      color-coding, despite the elegance of this technique and its wide range of
      applicability to practically important problems. This work gives various novel
      algorithmic improvements for color-coding, both from a worst-case perspective as
      well as under practical considerations. We apply the resulting implementation to the
      identification of signaling pathways in protein interaction networks, demonstrating
      that our improvements speed up the color-coding algorithm by orders of magnitude
      over previous implementations. This allows more complex and larger structures to be
      identified in reasonable time; many biologically relevant instances can even be
      solved in seconds where, previously, hours were required.
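      To make the technique concrete, here is a minimal color-coding sketch for minimum-weight paths (hypothetical names and a plain dictionary DP; nothing like the paper's engineered implementation):

```python
import random

def colorful_min_path(vertices, adj, weight, k, trials=100):
    """Color-coding for a minimum-weight simple path on k vertices.
    Each trial colors the vertices with k colors at random; a dynamic
    program then finds the lightest 'colorful' path (all colors
    distinct). A fixed k-vertex path is colorful with probability
    k!/k^k, so repeated trials find the optimum with high probability.
    `weight[(u, v)]` must hold the weight of edge u-v in both orders."""
    best = float("inf")
    for _ in range(trials):
        color = {v: random.randrange(k) for v in vertices}
        # dp[v][S] = lightest colorful path ending at v using color set S
        dp = {v: {frozenset([color[v]]): 0.0} for v in vertices}
        for size in range(1, k):         # extend paths one vertex at a time
            for v in vertices:
                for S, w in list(dp[v].items()):
                    if len(S) != size:
                        continue
                    for u in adj[v]:
                        if color[u] in S:
                            continue     # u's color already on the path
                        S2, w2 = S | {color[u]}, w + weight[(v, u)]
                        if w2 < dp[u].get(S2, float("inf")):
                            dp[u][S2] = w2
        for v in vertices:
            for S, w in dp[v].items():
                if len(S) == k:
                    best = min(best, w)
    return best
```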
     | 
| 2007 | 
| Conference articles | 
| 
  
    Falk Hüffner,
      Nadja Betzler, and
      Rolf Niedermeier: Optimal edge deletions for signed graph balancing.
 In Proceedings of the 6th Workshop on Experimental Algorithms
    (WEA ’07),
    Rome, Italy. June 2007.
 Volume 4525 in Lecture Notes in Computer Science, pages 297–310, Springer
      (original publication).
      Journal version available.
 Abstract.
      The Balanced Subgraph problem (edge
      deletion variant) asks for a 2-coloring of a graph that
      minimizes the inconsistencies with given edge labels. It has
      applications in social networks, systems biology, and integrated
      circuit design.  We present an exact algorithm for Balanced Subgraph based on a combination of
      data reduction rules and a fixed-parameter algorithm. The data
      reduction is based on finding small separators and a novel
      gadget construction scheme. The fixed-parameter algorithm is
      based on iterative compression with a very effective heuristic
      speedup. Our implementation can solve biological real-world
      instances exactly for which previously only approximations
      [DasGupta et al., WEA 2006] were known.
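      The feasibility test behind Balanced Subgraph is worth seeing concretely: a signed graph is balanced iff a parity-annotated union-find processes all edge labels without conflict. A minimal sketch (hypothetical names; this is only the consistency check, not the paper's exact solver):

```python
def is_balanced(n, signed_edges):
    """Is the signed graph 2-colorable so that every '='-edge joins equal
    colors and every '!='-edge joins different colors? Union-find with a
    parity bit per vertex relative to its root. Vertices are 0..n-1;
    `signed_edges` is a list of (u, v, unequal) with unequal=True for '!='."""
    parent = list(range(n))
    parity = [0] * n

    def find(v):
        if parent[v] == v:
            return v, 0
        root, p = find(parent[v])
        parent[v], parity[v] = root, parity[v] ^ p   # path compression
        return root, parity[v]

    for u, v, unequal in signed_edges:
        ru, pu = find(u)
        rv, pv = find(v)
        need = pu ^ pv ^ int(unequal)    # parity required between the roots
        if ru == rv:
            if need:
                return False             # inconsistent cycle: unbalanced
        else:
            parent[ru], parity[ru] = rv, need
    return True
```

      The edge-deletion variant then asks for the fewest edges whose removal makes this check succeed.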
    
    Falk Hüffner,
      Sebastian Wernicke, and
      Thomas Zichner: Algorithm engineering for color-coding to facilitate signaling pathway detection.
 In Proceedings of the 5th Asia-Pacific Bioinformatics Conference
      (APBC ’07), 
      Hong Kong. January 2007.
 Volume 5 in Advances in Bioinformatics and Computational Biology, pages 277–286,
      Imperial College Press
      (original publication).
      Journal version available.
 Abstract.
      Color-coding is a technique to design fixed-parameter algorithms
      for several NP-complete subgraph-isomorphism problems. Somewhat
      surprisingly, not much effort has so far been spent on the actual
      implementation of algorithms that are based on color-coding,
      despite the elegance of this technique and its wide range of
      applicability to practically important problems. This work gives
      various novel algorithmic improvements for color-coding, both
      from a worst-case perspective as well as under practical
      considerations. We apply the resulting implementation to the
      identification of signaling pathways in protein interaction
      networks, demonstrating that our improvements speed up the
      color-coding algorithm by orders of magnitude over previous
      implementations. This allows more complex and larger structures
      to be identified in reasonable time; many biologically relevant
      instances can even be solved in seconds where, previously, hours
      were required.
    
    Christian Komusiewicz,
      Falk Hüffner,
      Hannes Moser, and
      Rolf Niedermeier: Isolation concepts for enumerating dense subgraphs.
 In Proceedings of the 13th International Computing and Combinatorics Conference
      (COCOON ’07),
      Banff, Canada. July 2007.
 Volume 4598 in Lecture Notes in Computer Science, pages 140–150, Springer
      (original publication).
      Journal version available.
 Abstract.
      In a graph G = (V, E), a vertex subset
      S ⊆ V of size k is called
      c-isolated if it has fewer than c·k outgoing
      edges.  We repair a nontrivially flawed algorithm for
      enumerating all c-isolated cliques due to Ito et al. [ESA
      2005] and obtain an algorithm running in
      O(4^c · c^4 · |E|)
      time. We describe a speedup trick that also helps in parallelizing
      the enumeration.  Moreover, we introduce a more restricted and a
      more general isolation concept and show that both lead to faster
      enumeration algorithms.  Finally, we extend our considerations
      to s-plexes (a relaxation of the clique notion),
      providing a W[1]-hardness result and a fixed-parameter algorithm
      for enumerating isolated s-plexes.
 | 
| Journal articles | 
| 
  
    Jens Gramm,
      Jiong Guo,
      Falk Hüffner,
      Rolf Niedermeier,
      Hans-Peter Piepho, and
      Ramona Schmid: Algorithms for compact letter displays: comparison and evaluation.
 Computational Statistics & Data Analysis,
      52(2):725–736, 2007
      (original publication).
 Abstract.
      Multiple pairwise comparisons are one of the most frequent tasks
      in applied statistics. In this context, letter
      displays may be used for a compact presentation of the results
      of multiple comparisons. A heuristic previously proposed for
      this task is compared with two new algorithmic approaches. The
      latter rely on the equivalence of computing compact letter
      displays to computing clique covers in graphs, a
      problem that is well studied in theoretical computer science.  A
      thorough discussion of the three approaches aims to compare the
      algorithms’ advantages and disadvantages.  The
      three algorithms are compared in a series of experiments on
      simulated and real data, e.g., using data from wheat and
      triticale yield trials.
  
      Jiong Guo,
	Falk Hüffner, and
	Hannes Moser: Feedback arc set in bipartite tournaments is NP-complete.
 Information Processing Letters,
      102(2–3):62–65, 2007
      (original publication).
 Abstract.
    The Feedback Arc Set problem asks whether it
    is possible to delete at most k arcs to make a directed graph
    acyclic. We show that Feedback Arc Set is
    NP-complete for bipartite tournaments, that is, directed graphs
    that are orientations of complete bipartite graphs.
  
    Falk Hüffner,
      Sebastian Wernicke, and
      Thomas Zichner: FASPAD: fast signaling pathway detection.
 Bioinformatics (Applications Note),
      23(13):1708–1709, 2007
      (original publication).
 Abstract.
      Faspad is a user-friendly tool that
      detects candidates for linear signaling pathways in protein
      interaction networks based on an approach by Scott et
      al. [Journal of Computational Biology, 2006]. Using recent
      algorithmic insights, it can solve the underlying NP-hard
      problem quite fast: For protein networks of typical size
      (several thousand nodes), pathway candidates of length up to 13
      proteins can be found within seconds and with a 99.9%
      probability of optimality. Faspad
      graphically displays all candidates that are found; for
      evaluation and comparison purposes, an overlay of several
      candidates and the surrounding network context can also be
      shown.
     
      FASPAD is available for Linux (i386) and Windows (i386), and
	  as source code | 
 | Theses | 
| 
  
    Falk Hüffner: Algorithms and Experiments for Parameterized Approaches to Hard Graph Problems.
 PhD thesis, Institut für Informatik, Friedrich-Schiller-Universität Jena, 2007.
 Abstract.
    This thesis is about the design, analysis, implementation, and
    experimental evaluation of algorithms for hard graph problems. The
    aim is to establish that the concept of fixed-parameter
    tractability, and in particular novel algorithmic techniques whose
    development was driven by this concept, can lead to practically
    useful programs for exactly solving real-world problem instances.
    In particular, we examine graph problems such as Clique Cover, Balanced
    Subgraph, and Minimum-Weight Path,
    which have applications in computational biology and other areas.
    We mainly use the techniques of data reduction, iterative
    compression, and color-coding. We develop a number of new
    fixed-parameter algorithms and give implementations for four
    important problems, all of which are, to the best of our knowledge,
    the currently fastest solvers.
 | 
	    
	    
| 2006 | 
| Conference articles | 
| 
  
      Matthias Brosemann,
	Jochen Alber,
	Falk Hüffner, and
	Rolf Niedermeier: Matrix robustness, with an application to power system observability.
 In Proceedings of the 2nd Algorithms and Complexity in Durham
      (ACiD ’06), 
      Durham, England. September 2006.
 Volume 7 in Texts in Algorithmics, pages 37–48, College Publications.
 Abstract.  
    We initiate the study of the computational complexity of 
    Matrix Robustness: Check whether deleting any k rows from a
    full-rank matrix does not change the matrix rank.  This problem is
    motivated by applications in observability analysis of electrical power
    networks.  We indicate the coNP-completeness of Matrix
    Robustness, provide linear-programming-based solutions (both exact
    and heuristic) and a problem-specific algorithm, and present encouraging
    experimental results.
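    Despite the coNP-completeness, the definition admits a transparent brute-force check, which can serve as a correctness reference for LP-based methods; a minimal sketch over exact rationals (hypothetical names, exponential in k):

```python
from fractions import Fraction
from itertools import combinations

def rank(rows):
    """Rank of a rational matrix, by Gaussian elimination over Fraction
    (exact arithmetic, so no numerical rank decisions)."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for c in range(len(m[0]) if m else 0):
        piv = next((i for i in range(r, len(m)) if m[i][c]), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(len(m)):
            if i != r and m[i][c]:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def is_k_robust(rows, k):
    """Does deleting *any* k rows leave the rank unchanged? Checks all
    row subsets, so it is only usable for small instances."""
    full, n = rank(rows), len(rows)
    return all(
        rank([rows[i] for i in range(n) if i not in drop]) == full
        for drop in map(set, combinations(range(n), k))
    )
```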
  
      Michael Dom,
        Jiong Guo,
	Falk Hüffner,
	Rolf Niedermeier, and
	Anke Truß: Fixed-parameter tractability results for feedback set problems in tournaments.
 In Proceedings of the 6th Conference on Algorithms and Complexity
      (CIAC ’06), 
      Rome, Italy. May 2006.
 Volume 3998 in Lecture Notes in Computer Science, pages 320–331, Springer
      (original publication).
      Journal version available.
 Abstract.	  
    Complementing recent progress on classical complexity and polynomial-time
    approximability of feedback set problems in (bipartite) tournaments, we
    extend and partially improve fixed-parameter tractability results for
    these problems.  We show that Feedback Vertex Set
    in tournaments is amenable to the novel iterative compression
    technique. Moreover, we provide data reductions and problem kernels for
    Feedback Vertex Set and Feedback
    Arc Set in tournaments, and a depth-bounded search tree for Feedback Arc Set in bipartite tournaments based on a new
    forbidden subgraph characterization.
  
      Jens Gramm,
	Jiong Guo,
	Falk Hüffner, and
	Rolf Niedermeier: Data reduction, exact, and heuristic algorithms for clique cover.
 In Proceedings of the 8th Workshop on Algorithm Engineering and Experiments
      (ALENEX ’06), 
      Miami, USA. January 2006.
 Pages 86–94, SIAM
      (original publication).
      Journal version available.
 Abstract.
    To cover the edges of a graph with a minimum number of cliques is an
    NP-complete problem with many applications. The state-of-the-art solving
    algorithm is a polynomial-time heuristic from the 1970s. We present an
    improvement of this heuristic. Our main contribution, however, is the
    development of efficient and effective polynomial-time data reduction
    rules that, combined with a search tree algorithm, allow for exact problem
    solutions in competitive time. This is confirmed by experiments with
    real-world and synthetic data. Moreover, we prove the fixed-parameter
    tractability of covering edges by cliques.
    
      Jiong Guo,
	Falk Hüffner,
	Erhan Kenar,
	Rolf Niedermeier, and
	Johannes Uhlmann: Complexity and exact algorithms for multicut.
 In Proceedings of the 32nd International Conference on Current Trends in Theory and Practice of Computer Science
      (SOFSEM ’06), 
      Merin, Czech Republic. January 2006.
 Volume 3831 in Lecture Notes in Computer Science, pages 303–312, Springer
      (original publication).
      Journal version available.
 | 
| Journal articles | 
| 
  
	Michael Dom,
        Jiong Guo,
	Falk Hüffner, and
	Rolf Niedermeier: Error compensation in leaf power problems.
 Algorithmica,
      44(4):363–381, 2006
      (original publication).
 Abstract.
    The k-Leaf Root recognition problem is a
    particular case of graph power problems: For a given graph it asks whether
    there exists an unrooted tree—the k-leaf root—with
    leaves one-to-one labeled by the graph vertices and where the leaves have
    distance at most k iff their corresponding vertices in the graph are
    connected by an edge. Here we study “error correction”
    versions of Leaf Root recognition—that is,
    adding or deleting at most l edges to generate a graph that has a
    k-leaf root. We provide several NP-completeness results in this
    context, and we show that the NP-complete Closest 3-Leaf
    Root problem (the error correction version of 3-Leaf Root) is fixed-parameter tractable with respect
    to the number of edge modifications or vertex deletions in the given
    graph.  Thus, we provide the seemingly first nontrivial positive
    algorithmic results in the field of error compensation for leaf power
    problems with k > 2.  To this end, as a result of independent
    interest, we develop a forbidden subgraph characterization of graphs with
    3-leaf roots.
  
	Jiong Guo,
	Jens Gramm,
	Falk Hüffner,
	Rolf Niedermeier, and
	Sebastian Wernicke: Compression-based fixed-parameter algorithms for Feedback Vertex Set and Edge Bipartization.
 Journal of Computer and System Sciences,
      72(8):1386–1396, 2006
      (original publication).
 Abstract.
    Settling a question that was open for ten years, we show that the NP-complete Feedback Vertex Set problem is deterministically
    solvable in O(c^k · m) time, where m
    denotes the number of graph edges, k denotes the size of the
    feedback vertex set searched for, and c is a constant.  As a second
    result, we present a fixed-parameter algorithm for the NP-complete Edge Bipartization problem with runtime
    O(2^k · m^2).
 | 
| 2005 | 
| Conference articles | 
| 
  Michael Dom,
	Jiong Guo,
	Falk Hüffner, and
	Rolf Niedermeier: Extending the tractability border for closest leaf powers.
 In Proceedings of the 31st International Workshop on
      Graph-Theoretic Concepts in Computer Science
      (WG ’05), 
      Metz, France. June 2005.
 Volume 3787 in Lecture Notes in Computer Science, pages 397–408, Springer
      (original publication).
      Journal version available.
 Abstract.	  
    The NP-complete Closest 4-Leaf Power problem asks,
    given an undirected graph, whether it can be modified by at most l
    edge insertions or deletions such that it becomes a 4-leaf power. Herein,
    a 4-leaf power is a graph that can be constructed by considering an
    unrooted tree—the 4-leaf root—with leaves one-to-one labeled
    by the graph vertices, where we connect two graph vertices by an edge iff
    their corresponding leaves are at distance at most 4 in the tree.
    Complementing and “completing” previous work on Closest 2-Leaf Power and Closest 3-Leaf
    Power, we show that Closest 4-Leaf Power is
    fixed-parameter tractable with respect to parameter l.  This gives
    one of the first and so far deepest positive algorithmic results in the
    field of “approximate graph power recognition”—note that
    for 5-leaf powers even the complexity of the exact recognition problem is
    open.
  Jiong Guo,
	Jens Gramm,
	Falk Hüffner,
	Rolf Niedermeier, and
	Sebastian Wernicke: Improved fixed-parameter algorithms for two feedback set problems.
 In Proceedings of the 9th Workshop on Algorithms and Data Structures  
      (WADS ’05),
      Waterloo, Canada. August 2005.
 Volume 3608 in Lecture Notes in Computer Science, pages 158–168, Springer
      (original publication).
      Journal version available.
Falk Hüffner: Algorithm engineering for optimal graph bipartization.
 In Proceedings of the 4th International Workshop on Efficient and 
        Experimental Algorithms
      (WEA ’05), 
      Santorini Island, Greece. May 2005.
 Volume 3503 in Lecture Notes in Computer Science, pages 240–252, Springer
      (original publication).
      Journal version available.
 Abstract.
    We examine exact algorithms for the NP-complete Graph
    Bipartization problem that asks for a minimum set of vertices to
    delete from a graph to make it bipartite. Based on the “iterative
    compression” method recently introduced by Reed, Smith, and Vetta,
    we present algorithmic improvements and experimental results.  The
    worst-case time complexity is improved from O(3^k ·
    kmn) to O(3^k · mn), where n is the
    number of vertices, m is the number of edges, and k is the
    number of vertices to delete. Our best algorithm can solve all problems
    from a testbed from computational biology within minutes, whereas
    established methods are only able to solve about half of the problems
    within reasonable time. | 
		
| Journal articles | 
| 
  
    Jens Gramm,
	Jiong Guo,
	Falk Hüffner, and
	Rolf Niedermeier: Graph-modeled data clustering: Exact algorithms for clique generation.
 Theory of Computing Systems,
      38(4):373–392, 2005
      (original publication).
 Abstract.
    We present efficient fixed-parameter algorithms for the NP-complete edge
    modification problems Cluster Editing and Cluster Deletion. Here, the goal is to make the fewest
    changes to the edge set of an input graph such that the new graph is a
    vertex-disjoint union of cliques. Allowing up to k edge additions
    and deletions (Cluster Editing), we solve this
    problem in O(2.27^k + |V|^3)
    time; allowing only up to k edge deletions (Cluster Deletion), we solve this problem in
    O(1.77^k + |V|^3) time. The key
    ingredients of our algorithms are two easy-to-implement bounded search
    tree algorithms and a reduction to a problem kernel of size
    O(k^3). This improves and complements previous
    work. Finally, we discuss further improvements on search tree sizes using
    computer-generated case distinctions.
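    The basic bounded search tree for Cluster Editing is simple enough to sketch: a graph is a disjoint union of cliques iff it has no induced P3 (edges u-v and v-w, non-edge u-w), and each such conflict can be destroyed in exactly three ways. A minimal O(3^k) sketch (hypothetical names; the refined case distinction is what brings this down to O(2.27^k)):

```python
from itertools import combinations

def cluster_editing(vertices, edges, k):
    """Search tree for Cluster Editing: while some induced P3 exists,
    branch on the three edits that destroy it. `edges` is an iterable of
    frozenset pairs; returns a set of edits or None if k is too small."""

    def find_p3(edges):
        for u, w in combinations(vertices, 2):
            if frozenset((u, w)) in edges:
                continue                 # need a NON-edge u-w
            for v in vertices:
                if v in (u, w):
                    continue
                if frozenset((u, v)) in edges and frozenset((v, w)) in edges:
                    return u, v, w       # conflict triple found
        return None

    def solve(edges, k):
        p3 = find_p3(edges)
        if p3 is None:
            return set()                 # P3-free: already a cluster graph
        if k == 0:
            return None
        u, v, w = p3
        for edit, new_edges in (
            (("del", u, v), edges - {frozenset((u, v))}),
            (("del", v, w), edges - {frozenset((v, w))}),
            (("add", u, w), edges | {frozenset((u, w))}),
        ):
            sol = solve(new_edges, k - 1)
            if sol is not None:
                return sol | {edit}
        return None

    return solve(frozenset(edges), k)
```
 | 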
| 2004 | 
 | Conference articles | 
| 
  
    Michael Dom,
	Jiong Guo,
	Falk Hüffner, and
	Rolf Niedermeier: Error compensation in leaf root problems.
 In Proceedings of the 15th Annual International Symposium on
        Algorithms and Computation
      (ISAAC ’04),
      Hong Kong, China. December 2004.
 Volume 3341 in Lecture Notes in Computer Science, pages 389–401, Springer
      (original publication).
      Journal version available.
Jiong Guo,
	Falk Hüffner, and
	Rolf Niedermeier: A structural view on parameterizing problems: distance from triviality.
 In Proceedings of the International Workshop on Parameterized and Exact Computation
      (IWPEC ’04),
      Bergen, Norway. September 2004.
 Volume 3162 in Lecture Notes in Computer Science, pages 162–173, Springer
      (original publication).
 Abstract.
    Based on a series of known and new examples, we propose the generalized
    setting of “distance from triviality” measurement as a
    reasonable and prospective way of determining useful structural problem
    parameters in analyzing computationally hard problems. The underlying idea
    is to consider tractable special cases of generally hard problems and to
    introduce parameters that measure the distance from these special
    cases. In this paper we present several case studies of distance from
    triviality parameterizations (concerning Clique,
    Power Dominating Set, Set
    Cover, and Longest Common Subsequence) that
    exhibit the versatility of this approach to develop important new views
    for computational complexity analysis. | 
| Journal articles | 
| 
  
    Jens Gramm,
    Jiong Guo,
    Falk Hüffner, and
    Rolf Niedermeier: Automated generation of search tree algorithms for hard graph
      modification problems.
 Algorithmica,
    39(4):321–347, 2004
    (original publication).
 Abstract.
    We present a framework for an automated generation of exact search tree
    algorithms for NP-hard problems.  The purpose of our approach is
    two-fold—rapid development and improved upper bounds.  Many search
    tree algorithms for various problems in the literature are based on
    complicated case distinctions. Our approach may lead to a much simpler
    process of developing and analyzing these algorithms. Moreover, using the
    sheer computing power of machines it may also lead to improved upper
    bounds on search tree sizes (i.e., faster exact solving algorithms) in
    comparison with previously developed “hand-made” search trees.
    One such example is the NP-complete Cluster Editing problem (also known as Correlation Clustering on complete unweighted graphs),
    which asks for the minimum number of edge additions and deletions to
    create a graph that is a disjoint union of cliques. The hand-made search
    tree for Cluster Editing had worst-case size
    O(2.27^k), which our new method improves to
    O(1.92^k).  (Herein,
    k denotes the number of edge modifications allowed.) | 
| 2003 | 
 | Conference articles | 
| 
  
    Jens Gramm,
    Jiong Guo,
    Falk Hüffner, and
    Rolf Niedermeier: Graph-modeled data clustering: fixed-parameter algorithms for clique generation.
 In Proceedings of the 5th Italian Conference on Algorithms and Complexity
    (CIAC ’03),
    Rome, Italy. May 2003.
 Volume 2653 in Lecture Notes in Computer Science, pages 108–119, Springer
    (original publication).
    Journal version available.
    Jens Gramm,
    Jiong Guo,
    Falk Hüffner, and
    Rolf Niedermeier: Automated generation of search tree algorithms for graph modification problems.
 In Proceedings of the 11th Annual European Symposium on Algorithms
    (ESA ’03),
    Budapest, Hungary. September 2003.
 Volume 2832 in Lecture Notes in Computer Science, pages 642–653, Springer
    (original publication).
    Journal version available.
 | 
 | Theses | 
| 
  
    Falk Hüffner: Graph Modification Problems and Automated Search Tree Generation.
 Diplomarbeit, Wilhelm-Schickard-Institut für Informatik,
  Universität Tübingen, October 2003.
 Abstract.
    We present a framework for the automated generation of exact search tree
    algorithms for NP-hard problems.  The purpose of our approach is two-fold:
    rapid development and improved upper bounds.
    Many search tree algorithms for various problems in the literature are
    based on complicated case distinctions. Our approach may lead to a much
    simpler process of developing and analyzing these algorithms. Moreover,
    using the sheer computing power of machines it can also provide improved
    upper bounds on search tree sizes (i.e., faster exact solving algorithms)
    in comparison with previously developed “hand-made” search
    trees.  The generated search tree algorithms work by expanding a
    “window view” of the problem with local information, and then
    branch according to a precalculated rule for the resulting window.
    A central example in this work is given with the NP-complete Cluster Editing problem, which asks for the minimum
    number of edge additions and deletions in a graph to create a cluster
    graph (i.e., a graph which is a disjoint union of cliques).  For Cluster Editing, a hand-made search tree is known with
    O(2.27^k) worst-case size, which our new method improves to
    O(1.92^k).  (Herein,
    k denotes the number of edge modifications allowed.) A refinement
    of the search tree generator is presented, which tries to direct the
    expansion of the window towards advantageous case sets. The scheme is also
    successfully applied to several other graph modification problems;
    generalizations to other problems are sketched.
   | 
| 2002 | 
 | Theses | 
| 
  Falk Hüffner: Finding Optimal Solutions to Atomix.
 Studienarbeit, Wilhelm-Schickard-Institut für Informatik,
  Universität Tübingen, December 2002.
 Abstract.    
    We present the first solutions to benchmark instances of the solitaire
    computer game Atomix, found with different heuristic search methods. The
    problem is shown to be NP-hard. An implementation of the heuristic
    algorithm A* is presented that needs no priority queue, thereby having
    very low memory overhead. The limited-memory algorithm IDA* is handicapped
    by the fact that, due to move transpositions, duplicates appear very
    frequently in the problem space; several schemes of using memory to
    mitigate this weakness are explored, among them “partial”
    schemes that trade memory savings for a small probability of not finding
    an optimal solution. Even though the underlying search graph is directed,
    backward search is shown to be viable, since the branching factor is the
    same as for forward search.
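    Since the abstract leans on properties of A* and IDA*, a generic IDA* skeleton may help (a sketch with hypothetical names; the thesis's Atomix-specific duplicate-handling schemes are exactly what this bare version lacks):

```python
def ida_star(start, is_goal, successors, h):
    """Iterative deepening A*: depth-first searches under an increasing
    bound on f = g + h. Memory is needed only for the current path,
    which suits huge state spaces; `h` must be admissible and
    `successors(node)` must yield (successor, step_cost) pairs."""
    bound = h(start)
    path = [start]

    def search(g):
        node = path[-1]
        f = g + h(node)
        if f > bound:
            return f                     # exceeded bound; candidate for next one
        if is_goal(node):
            return "FOUND"
        minimum = float("inf")
        for succ, cost in successors(node):
            if succ in path:             # cheap cycle check on the current path
                continue
            path.append(succ)
            t = search(g + cost)
            if t == "FOUND":
                return t
            minimum = min(minimum, t)
            path.pop()
        return minimum

    while True:
        t = search(0)
        if t == "FOUND":
            return path                  # optimal solution path
        if t == float("inf"):
            return None                  # search space exhausted; no solution
        bound = t                        # smallest f that exceeded the bound
```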
   | 
| 2001 | 
 | Conference articles | 
|  |