EFIM: a fast and memory efficient algorithm for high-utility itemset mining

Souleymane Zida, Philippe Fournier-Viger*, Jerry Chun Wei Lin, Cheng Wei Wu, S. Tseng

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

82 Scopus citations


In recent years, high-utility itemset mining has emerged as an important data mining task. However, it remains computationally expensive both in terms of runtime and memory consumption. It is thus an important challenge to design more efficient algorithms for this task. In this paper, we address this issue by proposing a novel algorithm named EFIM (EFficient high-utility Itemset Mining), which introduces several new ideas to more efficiently discover high-utility itemsets. EFIM relies on two new upper bounds named revised sub-tree utility and local utility to more effectively prune the search space. It also introduces a novel array-based utility counting technique named Fast Utility Counting to calculate these upper bounds in linear time and space. Moreover, to reduce the cost of database scans, EFIM proposes efficient database projection and transaction merging techniques named High-utility Database Projection (HDP) and High-utility Transaction Merging (HTM), also performed in linear time. An extensive experimental study on various datasets shows that EFIM is in general two to three orders of magnitude faster than the state-of-the-art algorithms d2HUP, HUI-Miner, HUP-Miner, FHM and UP-Growth+ on dense datasets and performs quite well on sparse datasets. Moreover, a key advantage of EFIM is its low memory consumption.
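To illustrate the array-based counting idea the abstract refers to, the sketch below (not the authors' code, and using a simplified TWU-style bound rather than the paper's exact prefix-relative sub-tree and local utility definitions) shows how a single pass over the database can accumulate an upper bound for every item in a flat array, in linear time and with space proportional to the number of items:

```python
# Minimal sketch of the "utility-bin" idea behind array-based utility
# counting: one database scan fills a flat array of per-item bounds.

def utility_bins(database, num_items):
    """database: list of transactions, each a list of (item, utility)
    pairs, with items encoded as integers in [0, num_items).
    Returns an array where bins[i] over-estimates the utility of any
    itemset containing item i (a TWU-style upper bound)."""
    bins = [0] * num_items
    for transaction in database:
        # transaction utility = sum of the item utilities it contains
        tu = sum(u for _, u in transaction)
        for item, _ in transaction:
            # each item absorbs the whole transaction utility
            bins[item] += tu
    return bins

# Hypothetical toy database: 3 items (0, 1, 2), 2 transactions.
db = [[(0, 5), (1, 2)], [(1, 3), (2, 4)]]
print(utility_bins(db, 3))  # -> [7, 14, 7]
```

Because the array is reinitialized (or reused) rather than rebuilt per candidate, each bound computation stays linear in the database size, which is the property the abstract highlights for Fast Utility Counting.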

Original language: English
Pages (from-to): 595-625
Number of pages: 31
Journal: Knowledge and Information Systems
Issue number: 2
State: Published - 1 May 2017


  • Fast Utility Counting
  • High-utility database merging and projection
  • Itemset mining
  • High-utility mining
  • Pattern mining

